I have been seeing a lot of grumbling about Christians and complaints that Christian moral activism is returning.
Name a single part of social justice/woke culture that did not result from what Christians were warning y'all about.
I guess some lessons will never be learned though. Society has just gone too far down the path of liberalism 🤷‍♂️
Christians promote the following: All men are created equal. All men are brothers. Genetics doesn't exist. The world is flat. Violence doesn't work. Hate the sin, not the sinner. Missions are to help Africans not Whites. Just two more millennia. Send prayers not bullets. Technology is bad. The Earth is 6500 years old. Faggots can get married. Integration, Immigration and Misgenation.
The only thing they got right: "By their fruits you shall know them," or however that saying goes. My pastor/scoutmaster raised a pyromaniac and an aichmomaniac, while at the same time telling me that I would never become an Eagle Scout on his watch. Joke's on him. Neither of his sons graduated high school because they couldn't follow basic rules, like not lighting the chemistry lab's entire magnesium supply on fire and not bringing knives to school.
It also applies to how every single Christian nation is fighting for gay pride parades in Ukraine and Uganda. Christianity is leftism, because it shares the same foundation: the tabula rasa and the NAP.
It is funny how so many Christians do not believe in genetics or evolution, when they are perfectly compatible with the Bible.
People will talk all day about the artificial divisions the Elite and Media push on us to keep us fighting and distracted, and then fall for the same game.
Thomas Jefferson was a deist. Christians promote the idea that all men are equal in the eyes of God, which means you're not inherently better or worse because of who you are, only because of your actions.
Though this must bother a white nationalist.
[The Crusades have entered the conversation.]
Uganda is a Christian nation.