I have been seeing a lot of grumbling about Christians and complaints that Christian moral activism is returning.
Name a single part of social justice/woke culture that did not result from what Christians were warning y'all about.
I guess some lessons will never be learned though. Society has just gone too far down the path of liberalism 🤷♂️
A pattern of overrepresentation that is very easy to find.
Pedophilia violates canon law as well. Unfortunately that doesn't stop priests from molesting kids, or the Church and even Christian communities from protecting those priests. Collecting taxes to live in mind-boggling luxury is also not something Jesus taught, if I recall correctly.
So you are saying Pagans and non-Catholic Christians were actually tolerated under Christian rule and weren't persecuted for heresy?
Christianity is political. That's inherent to the three monotheistic Abrahamic religions. The only reason Christianity isn't acting more like Islam is because Christianity/the Church was neutered by the resistance of the European peoples. It's because Europeans are inherently different from Arabs.
The Church was, for most of its existence, a political institution first and a spiritual one second. It has been and still is politically very active.
What made Luther different is that he acted in a way that couldn't be ignored. The Church didn't want to clean up its act, which is why they chose violence first, before realizing they had no choice but to adapt lest they inevitably lose their privileged status. They weren't misguided. They knew exactly what they were doing.
Yes. Even Catholics are marrying homosexuals now. Christian institutions are actively pushing for globohomo bullshit. Some more than others.
Is it misguided, though, or just actively practicing Christian values?