I have been seeing a lot of grumbling about Christians and complaints that Christian moral activism is returning.
Name a single part of social justice/woke culture that did not result from what Christians were warning y'all about.
I guess some lessons will never be learned though. Society has just gone too far down the path of liberalism 🤷‍♂️
Your methodology is the problem. You want to know WHY Christians lost their cultural foothold in the West and allowed the left to take over? Because you got too preachy, with a 'we're doing this for your own good' attitude.
That just invites rebellion, especially from younger generations. And would you look at that: the left ALWAYS casts itself as the 'rebels' and the underdogs.
This recent push regarding fanservice is making Christians look like the other side of the same coin as the leftist moralisers, so you're kneecapping yourselves rather than learning from previous mistakes.
Yet if you'd listened, you'd be far better off.