I have been seeing a lot of grumbling about Christians and complaints that Christian moral activism is returning.
Name a single part of social justice/woke culture that did not result from what Christians were warning y'all about.
I guess some lessons will never be learned though. Society has just gone too far down the path of liberalism 🤷‍♂️
Thomas Jefferson was a deist. Christians promote the idea that all men are equal in the eyes of God, which means you're not inherently better or worse because of who you are, only because of your actions.
That idea must bother a white nationalist, though.
[The Crusades have entered the conversation.]
Uganda is a Christian nation.