The West, in short, was Christendom. But Christendom died. If you live in the West now, you are living among its ruins. Many of them are still beautiful — intact cathedrals, Bach concertos — but they are ruins nonetheless. And when an old culture built around a sacred order dies, there will be lasting upheaval at every level of society, from the level of politics to the level of the soul. The shape of everything — family, work, moral attitudes, the very existence of morals at all, notions of good and evil, sexual mores, perspectives on everything from money to rest to work to nature to the body to kin to duty — all of it will be up for grabs. Welcome to 2021.
My question, to both the author and the people here, is: what is the West? Is it all of Europe? And the former European colonies? Up until the fall of the Iron Curtain, if you asked someone to define the West, they would probably have defined it as the democratic countries of Western Europe and their former colonies like the U.S. and Australia. If it is something European and Christian, then how do you describe pre-Christian Europe, or even the pre-Christian parts of Europe like the Germanic tribes during the Roman Empire? Would they be considered Western?
I have no doubt at all that Christianity played an important role in shaping Western culture, but I don't think it is necessarily a defining characteristic: you can have the West without Christianity, and you can have Christianity without being Western.