I have been seeing a lot of grumbling about Christians and complaints that Christian moral activism is returning.
Name a single part of social justice/woke culture that did not result from what Christians were warning y'all about.
I guess some lessons will never be learned though. Society has just gone too far down the path of liberalism 🤷‍♂️
I will give you two things they are wrong about: turning the other cheek, and treating forgiveness as a virtue.
^ guy who has never read the bible and just believes what jewish people told him about it