posted by KingLion7 +23 / -1

Much has been said about how the job industry has gone bonkers over the last couple of decades: jobs that never used to need college degrees now require them, the crazy vax requirements, the diversity quotas, etc. But when did the job titles get so fake and gay, and the job descriptions too? For example, I was looking up hotel jobs and saw crap like "food and beverage expert." The titles are fluffed-up shit, and the descriptions frequently make no sense either. When did that become the norm in America? For those of you not from America, is it like that in other countries?