What would your major examples be of shows that started off well but then became woke or pushed the usual "trendy" agendas? The first one that comes to mind for me would be Westworld. The first season was amazing, and the second season wasn't so bad, but I stopped watching a few episodes into the third season, when one of the women took out a criminal organization and then didn't kill its boss because she respected a woman in power.
If I were to bet, I'd say this next season of Stranger Things will really sideline the four guys and star the shoehorned lesbian chick instead. The last season was the last one I was going to watch, because it seemed to turn into another girl-power show.
I know it's a movie, but I thought Logan was good; the more I think about it, though, the more I realize it was the first example I saw of the trend of the man dying and passing the torch to a much better female. I can see the new Indiana Jones movie doing that too.
It's definitely true for Westworld. The first season was very good, but the second was pretty bad, even if it didn't seem to add too many progressive elements. I never got around to watching the third season at all.
Game of Thrones as well. I enjoyed the first season, and seasons 2 through 4 were good too, but I think I stopped watching partway through the fifth just because I didn't want to bother anymore.
Doctor Who probably counts too, but I haven't seen any of it and just know what others have said.
Yeah, I stopped watching Doctor Who a few episodes into the female Doctor. I said I'd give her a chance, despite the fact that it was done for no reason other than pandering, but the "cosmic white supremacists" episode was the last straw.
With Game of Thrones, I think that if GRRM had finished the books, the show would've turned out much better.