I have been seeing a lot of grumbling about Christians and complaining that Christian moral activism is returning.
Name a single part of social justice/woke culture that didn't come from exactly what Christians were warning y'all about.
I guess some lessons will never be learned though. Society has just gone too far down the path of liberalism 🤷♂️
Too many Christian Zionists. Too many Christians simping for Israel and Jews. Israel wouldn't have gotten so powerful if Christians hadn't been helping them along the way.
Also, some Christians are unable to see the importance of race and think that as long as someone converts, they're fine.
The churches in Japan are very pro-refugee. https://files.catbox.moe/h9fpo7.jpg
There was a Japanese Orthodox Christian diplomat called Chiune Sugihara who went against orders in WW2 to issue visas that brought thousands of Jewish refugees through Japan, even though he had been told numerous times by his bosses not to.
Christianity ignores race completely, and converting everyone is what it aims to achieve. If you put your religion above your own family/people/race/culture, you'll inevitably end up where we are today.
And yet 'where we are today' did not happen in the 2000 years of Christianity, but only when Christianity fell.
What is your definition of 'when Christianity fell'? All this began during a time when the US was probably the most religious country on Earth.
And we can also talk about how beneficial Christianity was to the people during those 2000 years. Y'know, with all the religious wars, the deeply corrupt clergy, and all the other nice things that happened. So let's not pretend that Christianity stopped any bad shit from happening.
Eh... not quite. I have serious doubts about US religiosity. It may be religious compared to Europe, but not compared to the world. Furthermore, all this nonsense originated in the least Christian areas.
Just how many religious wars are you thinking of in these 2000 years to talk about 'all the religious wars'?
We're talking about 2000 years, and you suggest that what can be said of some clergy over a few decades in the 16th century was somehow the universal experience.
Well no, I wasn't arguing that Christianity led to a heaven on earth where nothing bad ever happened. Some of the wiser aspects of Christianity are not even the theological ones but the practical ones: a heaven on earth is impossible, and attempts to create one lead to tragedy (see the French and Bolshevik revolutions); humans should not play god (see transgenderism); humans are by their nature fallen and imperfect (which is why you don't give absolute power to an individual).
I'll give you the Zionism, but Christianity literally calls for ignoring race. It's actually one of the big points of contention with said Zionists. Jesus abolished the idea of Jewish supremacy.
So basically the church positions are too liberal?