I have been seeing a lot of grumbling about Christians and complaints that Christian moral activism is returning.
Name a single part of social justice/woke culture that did not result from what Christians were warning y'all about.
I guess some lessons will never be learned though. Society has just gone too far down the path of liberalism 🤷♂️
You don't get that choice, and you have already seen that abandoning Christianity results in hyper-wokeness. The slippery slope is there, and we're on it right now.
Christianity and wokeism aren't yin and yang. The absence of one does not automatically mean the presence of the other.
Christianity lost popularity with the kids in the 2000s, largely thanks to the internet at a time when it really was a wild west of ideas. The powers that be simply recognized that Christianity wasn't going to have a hold on the new generation, so they pivoted to rainbow communism. In both cases it's all been bread and circuses to control the masses.