I have been seeing a lot of grumbling about Christians and complaints that Christian moral activism is returning.
Name a single part of social justice/woke culture that did not result from exactly what Christians were warning y'all about.
I guess some lessons will never be learned though. Society has just gone too far down the path of liberalism 🤷‍♂️
It drives me up the wall to no end how so many Christians were taught the wrong version of this. If you were in a captive situation, you would mock your captor by showing the ineffectiveness of his slap. It's not that you let yourself become captive to an aggressor in the first place.
There's a long rant in my brain somewhere about this, but a lot of current Christian teaching basically boils down to "We really, really, really need to keep men from being violent."
When you start reading Christian teaching in context, you begin to see a trend where violence is very permissible in the right circumstances. Or, put another way: murder is bad. Killing, however, is very much on the table...
Remember, when people ask "what would Jesus do?": flipping over some tables, making a whip, and beating the shit out of people breaking God's laws is on the table.
I have had this argument several times IRL. Enabling your own oppression and persecution isn't the same as persevering under it.