I have been seeing a lot of grumbling about Christians and complaints that Christian moral activism is returning.
Name a single part of social justice/woke culture that did not result from what Christians were warning y'all about.
I guess some lessons will never be learned though. Society has just gone too far down the path of liberalism 🤷‍♂️
It is funny how so many Christians do not believe in genetics or evolution, when they are perfectly compatible with the Bible.
People will talk all day about the artificial divisions the Elite and Media push on us to keep us fighting and distracted, and then fall for the same game.