For graphics chasing, yes. A lot of studios want their games to look real rather than lived-in and stylized, like they used to, and that starts to make them all look the same, even when it looks nice.
As for story, I think it was John Carmack who said that plot in a video game is nice, and it has to be there, but it's sort of like the plot in porn: sure, it's there, but it's not the main reason you're playing. Some RPGs, especially BioWare titles, hired a bunch of writers to craft a compelling story, but a lot of that felt entirely unnecessary because it was basically window dressing. It was there, and it was interesting, but if it hadn't been, I'd probably still have played through the game anyway.
Aside from the rot killing gaming, a few other things happened organically that are contributing, too. Like the bloat.
In the corporate/AAA space, the workload got shifted to many instead of few. Dev teams used to be 3-8 people, depending on the complexity of the title, back in the SNES/Genesis era.
Then 12-15 for the PS1/Saturn/N64 era, then they hit 50, then 75, and now there are hundreds working on a title, and in a few extremely rare cases almost 500 people on a single game (if you add in all the outsourced "please do the needful" dev support). That's a lot of outgoing money that has to be recouped by game sales, on top of everyone else taking their cut.
So game budgets have ballooned to movie-sized budgets to pay for all that. Ironic that they can pay up to 500 people to make a game but expect QA to be done by one or two, if at all. But I digress. Even Borderlands' Dr. Ned took a jab at that: "If it compiles, ship it!"
Which means games now have to cater to everyone in order to sell enough for the company to keep existing. And that makes them all start to feel the same, because they give you what's proven to work rather than the fun, experimental, niche titles we used to get.
Innovation is also a double-edged sword. When something innovative does happen, especially when it becomes a genre-changing addition, other games will copy that innovation, sometimes into a game where it doesn't need to be, and then they all start feeling the same again.
As an example, by the mid-2010s people were tired of climbing towers to get the map revealed. It's something FarCry 3 did in 2012, carrying it over from FarCry 2, back when it felt new and exciting. Then everyone did it. If you were playing an open-world game, you were just expected to do a climbing puzzle to get the fog off a section of the map, even in games that had plenty of other options and no business making you do that. But all of a sudden, they all did. Ubisoft specifically had a fetish for copying mechanics between its open-world games. So your Tom Clancy open-world game felt like your FarCry open-world game felt like your Assassin's Creed open-world game felt like your The Division open-world game.
Non-Ubisoft titles did it too. If you played Dying Light, there were towers to climb to reveal the map there as well.
Shooters in general also started to look and feel identical in the Xbox 360/PS3 era. There was an infamous screenshot collage where six or nine screenshots from as many different games all showed the same gun, captioned "Name the game," and the only big difference was whether they used the green screen tint or the brown screen tint companies loved back then. For a good five years it was green and bloom, or brown and bloom.
Or the blood on the screen when taking damage. The roots of that go back as far as GoldenEye on N64: when you were hit, you'd let out a sharp gasp and the edges of the screen would flash white as your armor/health drained. That persists right up to today in Mad Max, FarCry 6, The Division 2, etc., with damage indicated by more and more blood splatter on the screen edges. I'm sure Call of Duty and Battlefield probably still do it too; I just haven't played those in a decade.
So even if the rot hadn't touched gaming (as unlikely as that is), there would still be homogenization and a samey feel, with very little gameplay innovation, just from needing to cater to everyone and make the same game as everyone else. The mentality of "it worked for them, it'll work for us too" isn't helping anyone.
I also don't remember anyone saying, "You know, I will just not buy this game at all unless it costs millions and millions to make." Most people were simply taken aback at the headcount, as in "Wow, 100 people worked on this?", without giving it much more thought.