The issue of deadly racism in the USA has been brought to the attention of …
What Happened to “the West”?
As America Drifts Away From Its Allies, a Less Peaceful World Awaits
It has become commonplace to speak of living in a “post-Western world.” Commentators typically invoke the phrase to herald the emergence of non-Western powers—most obviously China, but also Brazil, India, Indonesia, Turkey, and the Gulf states, among others. …