The U.S. Can Still Promote Democracy in Africa

America’s democracy, once seen as a shining light and an inspiration to democrats across the world, was pushed to the brink by Donald Trump’s presidency. In the aftermath of last month’s storming of the Capitol by right-wing extremists, some commentators declared that the United States’ own troubles mean it must now back away from promoting liberal values in the rest of the world. In fact, the opposite is true: having repelled a major challenge to its own democracy, America is now better positioned to promote democratic norms and values abroad.
