I have been seeing a lot of grumbling about Christians and complaining that Christian moral activism is returning.
Name a single part of social justice/woke culture that did not result from what Christians were warning y'all about.
I guess some lessons will never be learned though. Society has just gone too far down the path of liberalism 🤷♂️
Interesting. I've seen a similar argument that humans and worms are more similar than humans and god, because both are creations. But it still requires an argument to get there from universally accepted doctrine; that is, I would not say that it is obvious or implied.
Very interesting. I don't think this actually raises it to the level of Christian doctrine, or of something widely accepted, but this is honestly much more than I expected.
First of all, props for using Wiki the same way I do - with some shame. As I understand it, John Ball preached social equality: that no one should be a gentleman, as there was none in the Garden of Eden. Not even the founders said that; they favored an equality of rights for free men.
I think the problem is that 'equality' today is never qualified, as it should be. All men are equal in the eyes of god? Sure. All men should have equal rights? OK. All men are of equal worth? Eh... not so much.