I went to public school in a purple state. The Bible was mentioned precisely zero times in 13 years. These people live in an imaginary world.
What I was thinking. What school in modern times is teaching the Bible in a positive light?
No public schools - heck, in many public schools, any mention of anything Christian can get you reprimanded, if not fired.
Unless you want to denigrate it; then it's open season.
My freshman history teacher occasionally talked about how the Bible was so sexist, but that was it.
The first time the Bible was mentioned was in college, along with The Canterbury Tales in my English literature class (which was an elective).
Incidentally, don't take an Intro to Western Religion class if your only exposure was public school, because they assume you already know all the characters.
They start off talking about how Proverbs 37:29 exemplifies what Ezekiel said to John the Blasphemous at the Mound, and I'm like -- *looks around* -- wait a minute, this isn't my world.
"Where does Stone Cold come into play?"
I don’t remember the Bible ever being taught in public school.
Hell, the only thing you really learn about the Bible in public school is that Jesus of Nazareth was born around 4 BC and was executed by the Roman government around AD 30. Or have they taken even that much out of World History textbooks?
EDIT: Guys, I said "World History," not "American/US History." You're not going to find Christ mentioned in the latter.
In America we talk about two main events: colonization/pioneering and revolution. Later on, in college, the plight of our Indians and blacks was explored.
My US history classes - Elementary school: Columbus, Pilgrims, civil rights. Middle school: Indians, the Holocaust, civil rights. High school: slavery, the "Progressive Era," worshiping FDR, the Holocaust, civil rights.
And this was in the late '80s/early '90s, which is why I still got Columbus and the Pilgrims. I can't imagine how bad it is now.
In response to your edit: public schools in the US don't teach world history. That's why they're confused.
Wait, what? They don't teach how civilized society formed out of hunter/gatherer bands in Mesopotamia? How all sorts of early societies formed around rivers (the Thames, the Nile, etc.)? Notable leaders of those societies? The Caesars? The Louises? Marie "let them eat cake" Antoinette?
White people invented slavery, caught black wxmxyn from Wakanda with butterfly nets, then caused the Holocaust, but a gay man cracked the Enigma machine, which ended wars forever. You now know all of public school history.
I remember a history class in 7th grade (~age 13-14) that covered an extremely brief bit on very early human cultures. We learned about cuneiform, ziggurats, Ashurbanipal, the Spartans, etc., but only for one year. In my schools (~90% White, fairly nice) there was very limited concern with the rest of the world beyond some tests on remembering every country on a continent and maybe their capitals. Genuinely in-depth study of the things you mention would have been a later elective, probably not even available in my high school.
In 10th grade we had World Cultures, which I don't remember anything from except having to memorize a bunch of African countries and capitals, which I actively avoided doing. I remember telling my teacher that I didn't care about Africa, that it sucks, and that I was sick of having the 'starving African' commercials shoved in my face all the time to make me feel guilty about people I'll never meet in my life. That teacher strongly disliked me afterward.
Not in high school, and it's an elective in university.
WHAT THE FUCK HAPPENED TO EDUCATION IN THE PAST TWO DECADES??? I learned most of that stuff BEFORE high school!
Mormons and delusional JoJo fans would disagree ;)
Mormons get a paragraph in US history books. There's more about The Wizard of Oz being an allegory.
"Of course my political cult counts as a religious belief!"
Isn't it already more or less banned?
“Why have a religious book when you ban sexual grooming and made up history?!”
So critical race theory is religious dogma? Sorry, I already knew that. Nice of them to admit it though.
Admitting, of course, that anti-white ideology is a religious thing for them.
Make it compulsory.
Projection and strawmen are all they have.
A high school coach was sued because he told his winning team that he would go off school grounds and pray, and that they could join him if they wanted to.
I've definitely never seen a Bible anywhere in a school, including in a school library.
It's not just that Bibles are banned; Christendom itself is nearly banned.
So they admit it's a religion?
The book or teaching it? Either way, I accept your proposal. Let's throw in "gender theory" as well.
I see no issue with this.
The Bible should be (and is) taught in churches and in the home, by those most qualified to teach it: the pastors and priests and preachers. School should be (but isn't) about learning math, grammar, some chemistry and physics and accounting. Maybe some national history if they don't F it up.
Math? Best I can do is tranny porn and LGBT grooming.