What would your major examples be of shows that started off well but then became woke or pushed the usual "trendy" agendas? The first one that comes to mind for me is Westworld. The first season was amazing, and the second season wasn't so bad, but I stopped watching a few episodes into the third season, when one of the female leads took down a criminal organization and then refused to kill its boss because she respected a woman in power.
If I were to bet, I'd say the next season of Stranger Things will really sideline the four guys and star the shoehorned lesbian chick. The most recent season was the last one I planned to watch, because it seemed to be turning into another girl-power show.
I know it's a movie rather than a show, but Logan also comes to mind. I thought it was good, yet the more I think about it, the more I realize it was the first example I saw of the trend where the man dies and passes the torch to a supposedly much better female successor. I can see the new Indiana Jones movie doing the same thing.
I was going to mention Young Justice. I own the Blu-ray for season 3, but I haven't even watched it yet because of all the politics. I'm still bothered by Aqualad being made bi; his primary arcs in the previous two seasons revolved around his love for Aquagirl. Hell, he loved her so much that her death was a strong enough reason for basically everybody, even his best friends and teammates, to believe he would turn on the Justice League and join up with his villain father.
But hey, if you make him bi, you can ignore that and just pretend it was always part of his character and simply never came up before now. Also, having a time jump between every season is annoying; it's just a lazy way to skip building up your new characters and storylines.