I have been seeing a lot of grumbling about Christians and complaints that Christian moral activism is returning.
Name a single part of social justice/woke culture that did not result from what Christians were warning y'all about.
I guess some lessons will never be learned though. Society has just gone too far down the path of liberalism 🤷‍♂️
Remember, when people ask "what would Jesus do?": flipping over some tables, making a whip, and beating the shit out of people breaking God's laws is on the table.