As a woman I spent most of my working life in 'men's' jobs because I hate working indoors and I hate office politics. That pretty much left me with doing men's work. I always liked it. I wasn't fucking "empowered", they were good jobs with a good paycheck.
An honest day's work for an honest day's pay.
I'm old fashioned. Just a blue collar worker.
P.S. I didn't downvote you, this sub has its own peanut gallery of trolling feministas.
Years ago, Girlwriteswhat uploaded videos about women in history who made careers in male-dominated fields. She did this because she is an anti-feminist who wanted to blow holes in their claims that women have always been held down and held back.
I think most of us men don't have an issue with women working in whatever area they want, with the obvious exception of jobs that the vast majority of women are a really bad fit for, such as police, fire, front-line military, etc. These jobs are quite literally where lives are on the line.
Honestly, the big problem is not that women are working (it is an issue, but not the defining one). The defining issue is that men are now being held back and punished just for being born male. It is clearly wrecking the West, and I don't mean just for men; it is literally destroying the West.
I think most of us men don't have an issue with women working in whatever area they want, with the obvious exception of jobs that the vast majority of women are a really bad fit for.
After working in a few all-male environments over the course of my career, this is no longer the case for me. I don't care what field they work in so long as I don't have to work with them. All-male work environments have proven to be more productive, more supportive both professionally and personally, and, to be perfectly frank, more enjoyable. Women just destroy the dynamic that naturally emerges on a team of men. It's not even something they do actively or maliciously; the environment simply doesn't take the same form when they are present.