I have been seeing a lot of grumbling about Christians and complaints that Christian moral activism is returning.
Name a single part of social justice/woke culture that did not result from what Christians were warning y'all about.
I guess some lessons will never be learned though. Society has just gone too far down the path of liberalism 🤷‍♂️
Which Christians? You are talking about Protestantism, while Catholicism and Orthodoxy have their own sins.
Also, about the Evangelicals being referenced... Well, modern evangelicalism (as per Why Men Hate Going to Church) was founded by "former" feminists. Hence the barely concealed hatred of men.