The industry over-compensating for Covid and failing to adapt afterwards is a pattern I've witnessed personally multiple times, so that's definitely part of it.
I think the bloat of the industry, plus hedge-fund investment desperately seeking a bajillion-dollar return without putting any money in, is a terrible supply-side monetization strategy that should lead to the downfall of most of the AAA industry on its own, even before you count DIE incompetence, rampant corruption, and the worst quality control anywhere in the consumer marketplace.
I think the real threat to AAA gaming clearly comes from three sources:
AA Developers
Indie Developers
Backlog
For AA Development, it's very clear that games costing half of an AAA budget can make at least as much money, and still be significantly better.
For Indie Developers, they'll never make billions of dollars, but if you acquire enough of them under a publisher, you get a good, continuous revenue stream, which is really what investors are looking for. An indie franchise could be minimally funded and still knock out several solid games that sell reliably well and build a strong community.
For the backlog, before ShortFatOtaku went off the deep end, he made a very good point: if AAA gaming ceased to exist today, most gamers wouldn't feel it, because the backlog of games is so immeasurably massive that we'd probably see older games surge back in popularity simply because a whole slew of gamers finally got around to playing them. If your new games run and look as bad as games from five years ago, how are you going to compete with those exact games now that they're on sale again, or already sitting in my library waiting for me to finally get around to them? I don't think there's an answer to that.
Hopefully we can start to see some of these shithole developers, publishers, and corporations go under. "Nature is healing", just as soon as we start seeing game journos and DIE directors lose their careers and end up homeless.
If your new games run and look as bad as games from five years ago
I don't even think this is a fair metric. It's hard to beat the point graphics have already reached in the last five to seven years short of doing full body scans of everything. In fact, constantly trying to one-up your previous iteration is what has driven the cost of AAA gaming so high for these companies, as they keep paying millions of dollars more so that the handful of people with MEGA ULTRA displays can see five additional pores on a character's face.
The problem continues to be that AAA companies are obsessed with killing off their previous games so the newest one is the only one being played. Rather than letting people pick their favorite in the series based on its specific gimmick or iteration, there always has to be a new carrot to move everyone over with. That means a shit ton of money spent trying to one-up themselves over and over, even when they have nothing creative to do it with.
There are a lot more factors at play, but I think everyone, consumer and company alike, spends so long thinking about "how is this one better than what came before?" and not enough time on "what makes this one worth playing *as well*?" Being better every time is a losing game, because eventually you run out of bugs to fix and optimizations to make, whereas being also worth playing is always possible.
The “diminishing returns” narrative is 100% bullshit. Modern game devs are simply incompetent, lazy, cheap, and dishonest. They use temporal rendering to hide shit design and skip out on optimization. They model everything with way too many polygons and then drop it all directly into games with no consideration for rendering budgets. They slap on DLSS, producing horrific motion blur, artifacts, and input latency, and then they blame performance issues on you not having a $2000 GPU. They flock to UE5 for convenience, ease of use, and cost savings, lean on crap systems like Lumen and Nanite, and then refuse to tailor the engine to their needs. Worst of all, they say you're crazy for noticing any of this. Meanwhile, Source engine games from a decade ago have superior image clarity to anything made today. But you wouldn't know it from game trailers, which are always played with a controller to hide the motion blurring during faster movement.
I wouldn't say the diminishing returns argument is bullshit. The work required for marginally more fidelity, at the asset level, has been increasing exponentially with each generation. As scene density increases, the asset count also increases. Couple the two and either budgets balloon or significant compromises must be made. The compromises often result in more generic-looking assets (owing to extensive outsourcing with minimal art direction, the use of photogrammetry at some stage of the pipeline or parametric tools drawing on a common data set). The pursuit of realism is a matter of diminishing returns, and it reduces creative freedom.
Same with the computational cost of rendering techniques. Compare early stencil shadows with PCSS. Now compare PCSS to RT shadows. The former is a significant leap, the latter not so much. The same can be said for AO, reflections, and GI. The problem with a lot of approximations is that they usually prove unstable or heavily constrained. CryEngine's SVOGI, for example, can impose substantial constraints on how you build environments, especially interiors, and still suffers from ghosting and artifacts.
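To put rough numbers on that cost difference, here's a minimal Python sketch of the PCSS idea (blocker search, penumbra estimate, then a PCF filter over that region). The tap counts, light size, and stub shadow map are all made up for illustration; the point is that a single hard-shadow lookup turns into dozens of taps per pixel before ray tracing even enters the picture.

```python
# Rough PCSS sketch: blocker search -> penumbra estimate -> PCF filter.
# All constants and the stub "shadow map" below are illustrative only.
import math

LIGHT_SIZE = 0.05   # area-light radius in shadow-map UV space
SEARCH_TAPS = 16    # taps for the blocker search
PCF_TAPS = 32       # taps for the soft-shadow filter


def sample_depth(u: float, v: float) -> float:
    """Stub shadow map: an occluder at depth 0.5 covering the left half."""
    return 0.5 if u < 0.5 else 1.0


def pcss_shadow(u: float, v: float, receiver_depth: float) -> float:
    """Return 0.0 = fully shadowed, 1.0 = fully lit."""
    # 1. Blocker search: average depth of occluders near this pixel.
    blocker_sum, blockers = 0.0, 0
    for i in range(SEARCH_TAPS):
        a = 2.0 * math.pi * i / SEARCH_TAPS
        d = sample_depth(u + math.cos(a) * LIGHT_SIZE,
                         v + math.sin(a) * LIGHT_SIZE)
        if d < receiver_depth:
            blocker_sum += d
            blockers += 1
    if blockers == 0:
        return 1.0  # no occluders: early out

    # 2. Penumbra width from similar triangles (blocker vs receiver depth).
    blocker_depth = blocker_sum / blockers
    penumbra = (receiver_depth - blocker_depth) / blocker_depth * LIGHT_SIZE

    # 3. PCF: filter the shadow test over the estimated penumbra.
    lit = 0
    for i in range(PCF_TAPS):
        a = 2.0 * math.pi * i / PCF_TAPS
        r = penumbra * (i + 1) / PCF_TAPS
        if sample_depth(u + math.cos(a) * r, v + math.sin(a) * r) >= receiver_depth:
            lit += 1
    return lit / PCF_TAPS


if __name__ == "__main__":
    # Sweep across the shadow edge to show the soft falloff.
    for i in range(5):
        u = 0.40 + 0.05 * i
        print(f"u={u:.2f}  shadow={pcss_shadow(u, 0.5, 0.9):.2f}")
```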
Rasterization-based approximations had decades to mature and still weren't great. Developer laziness is inexcusable, but the biggest sin was selling real-time ray tracing, desirable as it is for any number of reasons, as viable multiple hardware generations too early, before the associated tech had matured. And it was done to sell GPUs, not games.
the use of photogrammetry at some stage of the pipeline or parametric tools drawing on a common data set
Unless your game is a stylised Nintendo game, it's always cheaper to use photogrammetry, which gives you 1:1 realistic geometry, and with today's pipelines you can use your smartphone for asset capture. It's also how Capcom managed to cut millions out of the budget of the newer Resident Evil games -- they talked about it in an interview a while back, using LiDAR for a bunch of the baked-in scene decoration to avoid having to model those assets manually.
The real cost in "realistic" art assets is in retopology and baking a usable mesh into your runtime, which, funnily enough, requires a lot of time spent downscaling the mesh to work well within the engine pipeline (ergo, creating various LODs). But it's still faster and cheaper than manually making the asset, if you're actually a good modeler.
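For what it's worth, the LOD part of that pipeline is fairly mechanical. Here's a minimal sketch assuming Blender's bpy API and an already-imported scan mesh named "scan_asset" (the name and ratios are placeholders, not from any actual project); it duplicates the scan and collapse-decimates each copy into a small LOD chain.

```python
# Minimal LOD-chain sketch, assuming Blender's bpy API and an imported
# photogrammetry scan named "scan_asset" (name and ratios are placeholders).
import bpy

LOD_RATIOS = [1.0, 0.5, 0.25, 0.1]  # fraction of source triangle count per LOD

src = bpy.data.objects["scan_asset"]

for i, ratio in enumerate(LOD_RATIOS):
    # Duplicate the scan so the original stays untouched.
    lod = src.copy()
    lod.data = src.data.copy()
    lod.name = f"scan_asset_LOD{i}"
    bpy.context.collection.objects.link(lod)

    if ratio < 1.0:
        # Collapse-decimate down to the target ratio -- the manual step that
        # Nanite-style workflows let studios skip.
        mod = lod.modifiers.new(name="lod_decimate", type='DECIMATE')
        mod.ratio = ratio
        bpy.context.view_layer.objects.active = lod
        bpy.ops.object.modifier_apply(modifier=mod.name)
```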
But as Current Horror pointed out, a lot of studios are now skipping the optimisation phase by letting Nanite do the work for them, even though it results in horrible performance, which they then try to paper over with frame generation and motion blur, using horrible temporal anti-aliasing to hide the bad optimisation, and the result is the smeary ghosting that's really apparent in a ton of games.
The thing is, a lot of indie studios put out a ton of photorealistic walking sims using photogrammetry or LiDAR on next-to-nothing budgets. It's not the "realism" that's costly; it's making a functional, optimised game out of those realistic assets that's costly, and most studios don't put in the time to do so.