What would your major examples be of shows that started off well but then became woke or started pushing the usual "trendy" agendas? The first one that comes to mind for me is Westworld. The first season was amazing, and the second wasn't so bad, but I stopped watching a few episodes into the third season when the one lady took out an entire criminal organization and then refused to kill its boss because she respected a woman in power.
If I were to bet, I'd say the next season of Stranger Things will really sideline the four guys and star the shoehorned lesbian chick. The last season was the last one I'm going to watch, because it seemed to turn into yet another girl power show.
I know it's a movie, not a show, but Logan comes to mind. I thought it was good, but the more I think about it, the more I realize it was the first example I saw of the trend where the man dies and passes the torch to a much better female. I can see the new Indiana Jones movie doing the same.
Rick & Morty. Seasons 1 and 2 were great. In season 3 the original writers went woke and started hiring female writers with little experience just for the sake of having female writers. Not surprisingly, they started pushing woke stuff.
NCIS and its spinoffs. They were pretty good for quite a long time. However, all three series had a woke push at the same time: within a few episodes of each other they all replaced white, usually male, characters with black, usually female, characters. They also ran a bunch of other lefty story arcs along the lines of evil internet trolls attacking stronk female politicians and Muslim terrorists just being misunderstood.
Do the Deadpool movies count as a show? The first one was one of the best movies ever; the second was a total woke cringefest.
These days everything either starts out shit or inevitably turns to shit. Which is why I barely watch anything recent.
Seasons 4 and 5 are pretty decent though; maybe they ditched the female writers.