I have been seeing a lot of grumbling about Christians and complaints that Christian moral activism is returning.
Name a single part of social justice/woke culture that did not result from what Christians were warning y'all about.
I guess some lessons will never be learned though. Society has just gone too far down the path of liberalism 🤷‍♂️
You know, there are only two options for how a religion evolves over time, because if you don't adapt you end up like the Greek/Roman pantheon:
(Most popular) Claim the other side is heretical and try to kill them
Have a reformation, where key figures lead a change based on the inadequacies of the current structure. This is what led to the Catholic/Protestant split in Christianity
What Christianity is suffering from is that its very institutions are corrupt, and rather than focus on cleaning up internally, whether that's the Vatican or the Church of England etc., it tries to claim moral authority in culture against targets that are fictional.
They began losing when they tried to appeal to 'a modern audience' rather than promoting the virtues they had maintained for centuries.
Here we agree.