I have been seeing a lot of grumbling about Christians and complaints that Christian moral activism is returning.
Name a single part of social justice/woke culture that did not result from what Christians were warning y'all about.
I guess some lessons will never be learned though. Society has just gone too far down the path of liberalism 🤷‍♂️
When there are so many far bigger problems in society, Christians bitching about hot female characters in games just makes them look like weak, petty assholes who only want to bully the underdog, because taking on the bigger problems would be too hard for them.