What would your major examples be of shows that started off well but then went woke or started pushing the usual "trendy" agendas? The first one that comes to mind for me is Westworld. The first season was amazing, and the second season wasn't so bad, but I stopped watching a few episodes into the third season, when the one lady took out an entire criminal organization but then spared its boss because she respected a woman in power.
If I were to bet, I'd say the next season of Stranger Things will really sideline the four guys and star the shoehorned lesbian chick. The last season was already going to be my last anyway, because it seemed to turn into another girl power show.
I know it's a movie rather than a show, but Logan comes to mind. I thought it was good at the time, but the more I think about it, the more I realize it was the first example I saw of the trend where the man dies and passes the torch to a much better female successor. I can see the new Indiana Jones movie doing the same thing.
I can't recall one at the moment, since it would have been well before Westworld. I do remember wokeness rearing its head in Westworld S2, along with other examples of stupid writing (Dolores becoming a Mary Sue, the Man in Black being destroyed). And maybe this isn't an example of what you're talking about, but S2 had very little female nudity while having no problem waving dongs in everyone's face. People online were saying it was because #MeToo happened between S1 and S2. I didn't subscribe to HBO for dongs.