Every time I go to the grocery store, I pass the book section on the way to the register, and they always have those romance novels with muscular men on the covers. Nothing wrong with that, since those are made to appeal to women, but I can't help being a little annoyed sometimes, because appealing to men in any way causes wailing and gnashing of teeth from the usual suspects, and corporations are dumb enough to listen to them.
It would make for a good silly movie to have a man take over a romance novel publisher, or a network that produces romance movies, and completely change them. He could subvert expectations a la Stranger Things season 3, where the couple seems headed for romance until one of the pair reveals they're gay at the end (one of the dumbest things I've seen in the woke era, since they admitted the original plan that season was for those two to get together), or he could just turn the romance company into an action-adventure/spy-thriller company, and then when the fans get upset, call them all sorts of names.
Another idea would be to use out-of-shape men with beer guts and lecture the women about the female gaze and how evil they are for wanting to see an attractive man.
Yes, that is petty, but I guess I just find it annoying that marketing to women is encouraged while marketing to men's tastes is somehow treated as evil nowadays.
You should put a picture of a transman over the hunky dude in the romance novel and see how angry the Leftist white women get when their tingles are disturbed. Bonus points for including a sign that says "Transmen are men"