I have been seeing a lot of grumbling about Christians and complaining that Christian moral activism is returning.
Name a single part of social justice/woke culture that did not result from what Christians were warning y'all about.
I guess some lessons will never be learned though. Society has just gone too far down the path of liberalism 🤷‍♂️
And the left is losing across the board thanks to that attitude.
As I said, it invites rebellion, weirdly more so than just saying 'do that shit and I'll shoot you'.