I have been seeing a lot of grumbling about Christians and complaints that Christian moral activism is returning.
Name a single part of social justice/woke culture that did not result from what Christians were warning y'all about.
I guess some lessons will never be learned, though. Society has just gone too far down the path of liberalism 🤷♂️
Thinking that authority should be derived from a book that contradicts itself and has undergone many translations and "fixes" over the centuries. It doesn't apply to all Christians, but there's an attitude among some that it is authoritative over everything in life, to the point that they try to force it on others. I could spin up a long list of all the cases of "why would God do this?" and argumentum ad Deum, but that book generates a perfect excuse in each instance for a Supreme Being who should warrant no excuses.
Granted, there are a lot of good morals in the book. However, running your world to the letter of a book written by men (one that's been around for far less time than mankind has existed) is dangerous. Now, cultural Marxists replace it with their own shifting rules and clergy, requiring the shifting agenda to be followed "precisely". They'd deny it's a cult, but to me it bears some resemblance.