Much has been said about how the job industry has gone bonkers over the last couple of decades. Jobs that never used to need college degrees now require them, there are crazy vax requirements, diversity requirements, etc. But when did job titles and job descriptions get so fake? For example, I was looking up hotel jobs and saw listings like "food and beverage expert." The titles are fluffed-up nonsense, and the descriptions frequently make no sense either. When did that become the norm in America? For those of you not from America, is it like that in other countries?
The economic collapse is because of Dem policies. They want to do two things: turn the economy green and fight a proxy war in Ukraine, except that each precludes the other, for obvious reasons. If you try to do both, you kill the economy.
Yes, but this has been in the works for quite a long while now; I think that's the main point people have been making. They've been taking over whole industries and making sure that only the brainwashed morons from their universities could ever hope to get a job: they want loyalists, not experts. What's amazing is that they even went after the games industry, which was realistically just a small cottage industry back in the day, and now they've turned it into this woke, corporatised monstrosity.