The U.S. Can Still Promote Democracy in Africa

America’s democracy, once seen as a shining light and inspiration to democrats across the world, was pushed to the brink by Donald Trump’s presidency. In the aftermath of last month’s storming of the Capitol by right-wing extremists, some commentators declared that the United States’ own troubles mean it must now back away from promoting liberal values in the rest of the world. But in fact, the opposite is true: Having repelled a major challenge to its own democracy, America is now better positioned to promote democratic norms and values abroad.
