I have a very good excuse for watching a Hollywood movie, I swear, pls no bully.
Anyway, regardless of the quality of the movie - a bit difficult to follow and somewhat romcommy (or romcommie according to some people here) - it surprised me that there were no really overtly woke elements, and that woke was even mocked at some point.
Avoiding spoilers for something no one here is ever going to watch, the two woke-adjacent elements were a guy who appeared somewhat gay (other than that, no LGBTP crap), and a female character with surprising prowess in fights.
But also: even the 'good' female characters were presented as obsessive and self-centered and as the butt of jokes. And one of the bad female characters, when she was caught, cited "toxic masculinity" while trying in vain to appeal to a fellow female not to side against her, even though she was clearly in the wrong.
So either movies are getting less woke or (more likely) this is a nice exception to the rule.
I think woke is being toned down a notch, not because they think it's bad but because they need money.
I'm willing to bet the money situation in Hollywood is worse than we expect.
Yep. Anything "based" is just pandering for money. The people behind the scenes still hate us as much as they ever have. They took off the mask. I'm not stupid enough to fall for it now that they are trying to put it back on. They can live off of the profits from the modern audience they wanted so much.
I can't see the phrase
MODERN AUDIENCES
without adding the reverb in my head.
Probably. Just look at Disney: they're at record lows from what we KNOW; if we saw their books, we'd be scoping out what we could take when they go bankrupt.