What would your major examples be of shows that started off well but then became woke or pushed the usual "trendy" agendas? The first one that comes to mind for me would be Westworld. The first season was amazing, and the second season wasn't so bad, but I stopped watching a few episodes into the third season when the one lady took out a criminal organization and then didn't kill the boss of the organization because she respected a woman in power.
If I were to bet, I would say this next season of Stranger Things will really sideline the four guys and star the shoehorned lesbian chick. The most recent season was the last one I was going to watch because it seemed to turn into another girl power show.
I know it is a movie rather than a show, but I thought Logan was good. The more I think about it, though, the more I realize it was the first example I saw of the trend of the man dying and passing the torch to a much better female. I can see the new Indiana Jones movie doing that.
Doctor Who, Arrow, Westworld, Warrior, The Expanse, Into The Badlands, Black Sails, Young Justice, Iron Fist....
Stranger Things was watchable in S1, although it was never great. Afterwards it was nothing but Girl Power bullshit.
Even SEAL Team went that way a bit in S4, which really annoyed me, because it was so good up to that point.
I was going to watch Black Sails but someone told me it wasn’t worth it. Iron Fist seemed to fold to pressure.
What happened to SEAL Team? Fighting white supremacy or something?
There was a completely unnecessary story arc about sexual harassment in the Navy, which served no purpose except to distract from the characters and stories that everyone actually watches the show for.
Oh yeah, I'm sure there's a ton of that in the SEALs
Fuckin A