Hal Brands, Columnist

What Does ‘the West’ Really Mean Anymore?

Biden is trying to re-engage the coalition of democracies that won the Cold War, but that’s probably not going to work against China.


President Joe Biden’s recent trip to Europe had a Cold War feel: It featured meetings with close democratic allies followed by a tense summit with an authoritarian adversary. So it is fitting that a Cold War-era concept — “the West” — is experiencing a renewal. The Biden administration has “an opportunity to reinvigorate the West,” the organizers of the Munich Security Conference — a high-profile gathering of the transatlantic policy elite — declared this year.

Yet if the notion of the West signals a badly needed revival of democratic cooperation, it may be too narrow a frame for the challenges the U.S. and its allies face.