What would your major examples be of shows that started off well but then became woke or pushed the usual "trendy" agendas? The first one that comes to mind for me is Westworld. The first season was amazing and the second wasn't bad, but I stopped watching a few episodes into the third season, when one character took out a criminal organization and then refused to kill its boss because she respected a woman in power.
If I were to bet, I'd say the next season of Stranger Things will really sideline the four guys and star the shoehorned lesbian chick. Last season was the last one I was going to watch, because it seemed to turn into another girl-power show.
I know it's a movie, but Logan comes to mind too. I thought it was good at the time, but the more I think about it, the more I realize it was the first example I saw of the trend where the man dies and passes the torch to a supposedly much better female successor. I can see the new Indiana Jones movie doing the same.
The TV version of The Mist was really bad from the beginning and only got worse. I loved the story growing up and liked the movie as well, so I was excited when they made a show based on it. But there was one redditor-type character who was just insufferable, and the real horror was what they did to a once-decent story. I'd give more examples, but I don't want to spoil it for the masochists out there who would still watch it anyway.