I have been seeing a lot of grumbling about Christians and complaints that Christian moral activism is returning.
Name a single part of social justice/woke culture that did not result from what Christians were warning y'all about.
I guess some lessons will never be learned, though. Society has just gone too far down the path of liberalism 🤷‍♂️
Can't they engage in moral activism without pushing feminist talking points? Can't they find influencers who didn't just "find God" when they needed a beta cuck to pay for their years on the CC? That's a scam the manosphere has been calling out for over a decade. It would be easier to assume good faith if they weren't doing that shit. Their complaints about attractive female characters in games might even be somewhat understandable (at least within their worldview) if they weren't blatantly ignoring the bigger fish we have to fry.
This is the one that grinds my gears the most. There isn't one notable female Christian influencer who wasn't a whore just a few years ago.