What would your major examples be of shows that started off well but then became woke or pushed the usual "trendy" agendas? The first one that comes to mind for me is Westworld. The first season was amazing, and the second season wasn't so bad, but I stopped watching a few episodes into the third season when the one lady took down a criminal organization and then didn't kill its boss because she respected a woman in power.
If I were to bet, I'd say this next season of Stranger Things will really sideline the four guys and star the shoehorned lesbian chick. The last season was the last one I was going to watch, because it seemed to turn into another girl-power show.
I know it's a movie, but I thought Logan was good. The more I think about it, though, the more I realize it was the first example I saw of the trend where the man dies and passes the torch to a much better female. I can see the new Indiana Jones movie doing the same.
Not sure it's a prime example, but The Walking Dead went from a really fun zombie-apocalypse show about a father trying to save his son, with excellent cinematography, to CW diversity-quota quality.
Tell me about it. I stopped watching in '18. I said I'd finish the comics and be done. The comics have a strong father ending, which was a pleasant surprise.
I stuck it out until they offed the kid. I was mostly rubbernecking at that point. It just got so bad. Honestly, I hadn't really enjoyed the majority of it since S6, but I trudged on in hopes of another Nebraska episode.
My bad.
I still think about that one episode from season 4. The one with Carol and the two little girls. That shit was haunting. Looking at the show now, it's insane how far it has fallen.
That episode was around the time I stopped watching.
Right around when they actually arrive at The Terminus.
It started going downhill after the second episode. I quit when it turned into a chick flick in the farmhouse.
EVERYONE IS GAY!!!