I have been seeing a lot of grumbling about Christians and complaints that Christian moral activism is returning.
Name a single part of social justice/woke culture that did not result from something Christians were warning y'all about.
I guess some lessons will never be learned though. Society has just gone too far down the path of liberalism 🤷‍♂️
Christians are so right about everything that they got destroyed by the left in a single generation, and most people want nothing to do with them. Such power!
Christians were right about what they warned about. You're fallaciously conflating being right with having power.