I have been seeing a lot of grumbling about Christians and complaints that Christian moral activism is returning.
Name a single part of social justice/woke culture that did not result from what Christians were warning y'all about.
I guess some lessons will never be learned though. Society has just gone too far down the path of liberalism 🤷‍♂️
I'll give you the Zionism, but Christianity literally calls for ignoring race. It's actually one of the big points of contention with said Zionists. Jesus abolished the idea of Jewish supremacy.