That last part is because women expect to be saved from their own poor choices, while men know nobody is going to come and save them. Hell, when I graduated from college, some churches even expected college-educated men not only to pay off their own loans but their girlfriends' loans as well.
Keep in mind, I am not saying men shouldn't improve themselves or that modern men don't have something to learn from older generations. I am not saying that at all. It just seems like a way to lay more responsibility onto young men and place the blame for a shitty society squarely at their feet, all while giving them watered-down red pill, or even flat-out blue pill, advice.
Athletes aren't warriors (with the exception, apparently, of Russian hockey players... course that might just be because they are Russian), entertainers aren't priests, merchants should not be in charge of culture, and politicians aren't leaders. They are all the fake and gay versions of what we really need.
Unfortunately, there's a good reason why those women get such jobs. Thanks to diversity and inclusion, women often get high-paying roles even if they're nowhere near good enough to perform at that level. Any show that's going to take the female point of view is not going to have the women grind to get where they need to go.
Demolition Man, 1984, Brave New World, and the evangelicals were all correct in their predictions but wrong in that they didn't realize how bad it would get. Probably the only one who got it right was that Civil War-era Confederate (I forgot his name) who predicted progress would lead to transgenderism.
Well, what do you know. Hispanics really do do the jobs Americans aren't willing to do.