I have been seeing a lot of grumbling about Christians and complaints that Christian moral activism is returning.
Name a single part of social justice/woke culture that did not result from what Christians were warning y'all about.
I guess some lessons will never be learned, though. Society has just gone too far down the path of liberalism 🤷‍♂️
Christians have gone soft since they stopped moshing after the '90s. Back in the day you'd go to a show and there would be a whole crew, a dozen or more Christian kids, tearing it up in the pit. When I suggest to Christians now that they get that going again, they balk. Listen to Zao and bring that energy back.
This definitely is not true of Christianity in general, and being anti-gay is literally for embarrassed gay men.