Every time I go to the grocery store, I pass the book section on the way to the register, and they always have those romance novels with muscular men on the covers. Nothing wrong with that, since they're made to appeal to women, but I can't help being a little annoyed sometimes, because appealing to men in any way causes wailing and gnashing of teeth from the usual suspects, and corporations are dumb enough to listen to them.
It would make for a good silly movie to have a man take over a romance novel publisher or a network that produces romance movies and completely change everything, or subvert expectations à la Stranger Things season 3, where the couple seems headed for romance but the guy reveals he's gay at the end (one of the dumbest things I've seen in the woke era, since they admitted the original plan that season was for those two to get together). Or he just turns the romance company into an action-adventure/spy-thriller company, and then when the fans get upset, he calls them all sorts of names.
Another thing would be to use out-of-shape men with beer guts and lecture the women about the female gaze and how evil they are for wanting to see an attractive man.
Yes, that is petty, but I guess I just find it annoying that marketing to women is encouraged while marketing to men's tastes is somehow considered evil nowadays.
I mean, they will, but you probably focused on the wrong lessons from them.
Such as "already be attractive," "if you are attractive, you can literally do anything you want," and most importantly, "if lessons 1 and 2 are fulfilled, she wants you to do the most disgusting, vile shit you can think of to her."
Yeah, "be attractive and don't be unattractive" is the basic advice. And any advice a woman gives you needs to be understood as "this is what I want the guy I find attractive to do." I worked in a mostly female call center when 50 Shades came out, and almost all of them were reading the book. I read it because I was curious, and it was definitely eye-opening.