The working world, when it was male-dominated, was 100x more fun to work in. Guys could shoot the shit with each other, say whatever they wanted, and competitiveness was seen as fun, not as "toxic". A male-dominated work environment made people better, because the men would compete with one another and strive to improve. Work was actually a lot more "fun", and in some ways it was a "safe space" for men to get away from all the drama and toxicity in their personal lives.
Women have utterly destroyed the workplace. Men's lives are inarguably much worse with women in the workplace. Nothing good has come of this.
Part of the challenge is that, historically, women never raised kids alone in an isolated setting. No woman before the last 100 years had to do what you are doing. Kids were raised in the company of many other children, teenagers, other moms, and men who were around much more. Nearly everyone used to be a farmer, working a plot of land next to their home. And there weren't dangerous things everywhere, like chemicals, scissors, and electrical sockets. You could literally just let a kid roam and hope the wolves didn't get them. If a child died, it was definitely sad, but mom wouldn't be thrown in prison, either.
I notice that when I'm around my little nieces and nephews, the parents visibly relax and everyone just kind of keeps an eye out for the kids. It's way more peaceful, and I think it's better for the kids, too.
Not disagreeing with the bulk of your comment, but I'm not sure if you're aware that 100 years ago was 1921. You know, the Roarin' '20s? Literally the decade that brought automobiles, radio, and film into the mainstream? Huge economic growth with massive shifts in culture?
I treasure your historical knowledge and accuracy, Mr. FutaCumDiet.