The West, in short, was Christendom. But Christendom died. If you live in the West now, you are living among its ruins. Many of them are still beautiful — intact cathedrals, Bach concertos — but they are ruins nonetheless. And when an old culture built around a sacred order dies, there will be lasting upheaval at every level of society, from the level of politics to the level of the soul. The shape of everything — family, work, moral attitudes, the very existence of morals at all, notions of good and evil, sexual mores, perspectives on everything from money to rest to work to nature to the body to kin to duty — all of it will be up for grabs. Welcome to 2021.
Comments (24)
Christendom didn't die, but the West did divorce her... Divorce is violent.
Yeah, my pastor used to tell us that the church is currently experiencing a soft persecution. Nothing like what's going on in Africa or Asia, but it will ratchet up. One of the first things is going to be punishment for not agreeing with LGBT dogma.
Churches are already being burned down in Canada and France.
True, I had forgotten that. I go to a Lutheran church, and it's funny: we're growing fastest in Asia and Africa but declining rapidly here and in Europe.
We're declining badly in Australia too, sadly.
Not to mention Antifa has been threatening to firebomb Catholic churches in Portland.
What happened to Notre Dame still angers me. That should have been a flashpoint for unbridled retaliation.
Like how Roman ruins are all over Europe?
My question, to both the author and the people here, is: what is the West? Is it all of Europe? And the former European colonies? Because up until the fall of the Iron Curtain, if you asked someone to define the West, they would probably define it as the democratic countries in western Europe and their former colonies like the U.S. and Australia. If it is something European and Christian, then how do you describe pre-Christian Europe, or even the pre-Christian parts of Europe, like the Germanic tribes during the Roman Empire? Would they be considered Western?
I have no doubt at all that Christianity played an important role in shaping Western culture but I don't think that it is necessarily a defining characteristic. Because you can have the West without Christianity and you can have Christianity without being Western.
Bullllllllllshit. Christians were a subversive slave cult of middle eastern camel fuckers, with rumors of cannibalism. You know, because they were poor slaves who celebrated weakness, meanness, and getting on your knees for authority.
Rome, Greece: the West is founded on merit, strength, and a republic with an informed elite ruling class that has stakes in the game. Christianity is one of the subversive forces that doomed Western civilization. A cuck factory religion.
No, Western civilisation first peaked in Rome and Ancient Greece, and neither was Christian.
Christianity is just tradcuck bullshit.
The problem is the female vote; that was the biggest mistake.
The West wasn't always Christian. The West will be fine without religion, they just have to defeat the SJWs and their communist friends.
I always thought the same, but that was based on religion being supplanted by rationality. I see, far too late, how hopelessly naive that was. Christianity has been replaced with a new religion, progressivism, with its destructive tenets of intersectionality, wokeness, sexual depravity etc... There is plenty I still don't like about Christianity, but it's tough to argue that its core tenets aren't a good basis for an enduring and successful civilization. I suppose, somewhat ironically, it really is a case of better the devil you know.
I agree that the new progressive religion is wrong and unethical, but is Christianity the answer? Something that cannot be proven? The Japanese aren't religious, and they are not crumbling because of wokeness or any SJW nonsense.
"tradcuckery built the West and other lies we get told to make us not hate their enablers."
You are literally retarded, man. "Tradcucks" did build the West. Traditions get ensconced into society because they were useful.
You argue for the extinction of man because the modern world has forsaken tradition. "Muh women are gonna ruin everything," but how dare you want to return to the time when they didn't have the right to vote, when their job was to support the family through homemaking.
Most of humanity's greatest people didn't have an insignificant other.
No but the builders who built their houses did. The tailors who made their clothes did. The merchants who sold them their food did. So did the printers who copied their novels and treatises and the workmen who put their ideas into practice.
None of those great minds would have achieved anything without the physical and social infrastructure that enabled them to do so. That infrastructure is rooted in community, which in turn is rooted in religion and family.
Most men, who aren't geniuses, are motivated primarily by their desire to provide for their families. It's always been that way, and it always will be.
Not in the West you dunce.
You may not like tradcuckery, but it did build the West.
Europe had religion before Christianity, so remove that pillar.