I have been seeing a lot of grumbling about Christians and complaints that Christian moral activism is returning.
Name a single part of social justice/woke culture that did not result from what Christians were warning y'all about.
I guess some lessons will never be learned, though. Society has just gone too far down the path of liberalism 🤷‍♂️
There's a long rant in my brain somewhere about this, but a lot of current Christian teaching basically boils down to 'We really, really, really need to keep men from being violent'.
When you start reading Christian teaching in context, you begin to see a trend where violence is very permissible in the right circumstances. Or, put another way: murder is bad. Killing, however, is very much on the table...
Remember, when people ask "what would Jesus do?", flipping over some tables, making a whip, and beating the shit out of people breaking God's laws is on the table.