One of my good friends and his wife usually do a cookout and throw a little party for every season of Stranger Things. He asked if I would be going this year and I said no. He has actually gotten tired of the show, but his wife still wants to watch. I thought the second season was worse, but it didn't really annoy me until the third. I've reached the point of being so tired of the "girls who are much better than men" trope that's in everything, and to top it off, they added the "insert unnecessary LGBT character" trope too.
I remember reading that she was supposed to be introduced as a romantic interest for the reformed jock (which makes sense for a show paying homage to the 80s), but then they decided to subvert expectations. So I would imagine that next season the four original boys will be background characters so the girls can shine. Also, that little sister was annoying, but of course the critics raved and said the lesbian character was the best one.
Breaking Bad is the only drama I can think of that doesn't either get canceled or crawl up its own ass.
I feel like season 5 Breaking Bad did kind of suck though.
In S01-S04, Walt could be seen as pretty much justified in his choices; it was him or them, and he was providing for his family, which is what a man does - a man provides.
But even back in 2013 it wasn't acceptable for a man to provide, so he has to ham-handedly declare it was never for them, he's suddenly working with literal Nazis, and nobody can profit from drugs (because drugs bad), so they needlessly kill off side character Andrea, and so on.
You can say that's just the continuing evolution of his character, but it reads like a mad scramble to unwrite the previous four seasons and post-hoc remove the justifications for his actions, because people had accepted those as positives (self-defense, David vs. Goliath, guile/ingenuity).
I agree. They later said the whole series was always meant to be a case study in how an audience can be conditioned to support an immoral character, and the producers wanted to test how far they could push him before most of the audience stopped supporting him - with S5 being when most people broke. Walt was always the bad guy.
I found that hard to believe, given some of the story beats and the unpredictable nature of TV production.
What I'm getting at is that what the writers believed was evil isn't what actual people agree is evil.
A man providing for his family isn't evil, but to the woke it's the evil patriarchy.
They set out to see how evil they could make him, but what they actually found out was how tragically heroic they could make him. Does he make even a single decision out of malice? I can't think of one - until season 5. Walt isn't even the same character in S1-S4 as he is in S5, and there's no explanation for it other than the show having to end.
Well, that explains why I couldn't make it past the first season, because that's exactly what it came across as.
This is my problem with these shows, and maybe the real brainwashing: getting people to sympathize with evil. Providing for your family by selling drugs isn't noble; it's evil from the get-go.
Breaking Bad needed to end the second Hank learned the truth. The song and dance afterward had a few great moments, but if the show had ended right there, it would have been a pretty kino final shot.
I think I read somewhere that Breaking Bad was written in its entirety, with the ending already planned before the pilot even aired, and they stuck to it. That's why it was so good.
They made lots of big changes. Jesse was supposed to be killed by Tuco in the first season for instance.
I think it was good because of smart writers, but also because it was, at the time, the only thing AMC had going for it (still the crown jewel even after Walking Dead). The show had all the power, and the network couldn't step in and make bullshit Hollywood changes. Gilligan really had them over a barrel. "Make Walt have a gay sex scene with Jesse" and Vince is like "fuck off" - probably happened hundreds of times.