The U.S. Can Still Promote Democracy in Africa

America’s democracy, once seen as a shining light and inspiration to democrats across the world, was pushed to the brink by Donald Trump’s presidency. In the aftermath of last month’s storming of the Capitol by right-wing extremists, some commentators declared that the United States’ own troubles mean it must now back away from promoting liberal values in the rest of the world. But in fact, the opposite is true: Having repelled a major challenge to its own democracy, America is now better positioned to promote democratic norms and values abroad.
