This is more because of consoles. Optimising for PC takes actual work and expertise, which is something devs don't have any more.
This can't be stressed enough. "Game programmers" today are really engine scriptwriters. They don't make the game - they license a framework for a game from somebody else and then write scripts, textures, and models to populate it. Optimization happens in the backend, the engine, and the modern game programmer doesn't work with the guts of that at all.
Triple-A studios have the resources for in-house engines or deep modifications to licensed middleware. Their problem is that biz-dev departments (i.e. people disconnected from consuming or developing the product) have an even larger share of production-affecting decisions than they did in the mid-2000s. The current industry's scope and scheduling philosophy is incongruous with making works of art, so they're not committing the right resources to robust architecture and other foundations.
I think Bethesda wrote the engine. Supposedly things like managing a ton of items got better in Starfield, though that's not something I really get into.
It shouldn't, but apparently most gamers are too retarded to care.
Because most are console players, or worse, phone "gamers".
What modern software? Most of the engine backbones used today date back to the mid-2000s.
Everybody has gotten lazy knowing they can just fix it in updates later, but the optimization issues are certainly the main problem. They load in the entire map at maximum detail, such that a quantum supercomputer would struggle to run it. Emulation is so great in one respect because people will fix it themselves: I know the GameCube version of Twilight Princess has a major issue when run on Dolphin that I believe comes from the game loading the entire main hub of the map at once, similar to many open worlds today. The difference is that one guy dropped an optimization patch that's like 2 megabytes and the problem goes away forever. These AAA gaming companies seemingly play whack-a-mole with optimization problems rooted in poor design decisions early in the game's development.
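For contrast, here is a minimal sketch of distance-based chunk streaming, the general technique a patch like that effectively restores: only the chunks near the player stay resident, instead of the whole map sitting in memory at full detail. Everything in it (Vec2, streamChunks, the sizes) is invented for illustration and isn't any particular engine's API.

// Hypothetical chunk-streaming sketch: keep only nearby chunks loaded.
#include <cmath>
#include <cstdio>

struct Vec2 { float x, z; };

// The world is divided into square chunks; chunks within `radius` of the
// player are streamed in, everything else is freed instead of sitting
// in memory at maximum detail.
void streamChunks(Vec2 player, float chunkSize, int worldChunks, float radius) {
    for (int cx = 0; cx < worldChunks; ++cx) {
        for (int cz = 0; cz < worldChunks; ++cz) {
            float centerX = (cx + 0.5f) * chunkSize;
            float centerZ = (cz + 0.5f) * chunkSize;
            float dist = std::hypot(centerX - player.x, centerZ - player.z);
            if (dist <= radius)
                std::printf("load   chunk (%d,%d)\n", cx, cz);
            else
                std::printf("unload chunk (%d,%d)\n", cx, cz);
        }
    }
}

int main() {
    // Player standing at (120, 80) in an 8x8-chunk world of 64-unit chunks.
    streamChunks({120.0f, 80.0f}, 64.0f, 8, 150.0f);
    return 0;
}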
Much like the entire concept of DLC, this problem can be traced straight back to Oblivion and was then pushed into the stratosphere by Skyrim. Fallout: New Vegas as well, for sheer bugs.
People handwaved away every technical problem possible with "well, mods can fix it" or "yeah, but the game is so good regardless!" and turned them into massive hits despite them being barely functional messes of duct tape and prayers.
And these were the early days of "later patches" to fix bugs, so most people were just left with the absolutely broken on-disc version, with no expectation of the game being fixed later unless an unpaid modder did it for the company.
It's an absolute travesty, the level of bullshit people just handwaved off for years despite it being an obvious problem at the time, which makes it all the more frustrating that many of the same people now complain about those problems years too late to stop them.
The biggest problem with the gaming industry now is that you have two main GPU manufacturers, each with at least a dozen relevant cards at any one time. That's ignoring all the other hardware differences too: HDD vs. SSD, memory limitations, etc. It's not an excuse not to test your game, especially when you're a huge company, and the above ignores consoles as well. Take Cyberpunk 2077 as the perfect example: they were told to ship it on last-generation consoles, where the game barely ran and was a bug fest. Play it on a modern PC? Nowhere near as many issues.
I wouldn't be surprised if most of it is due to developers cutting costs; a decent QA team could cost a fortune, especially for a small company.
The next issue is that most of the OGs (the developers who did it out of pure passion: Doom, RollerCoaster Tycoon, Fallout 1 and 2) are leaving the industry or have already retired. Many who remain will just see it as "a job where I get to make games, which is pretty cool," but they'll be a very small cog in a very big bureaucratic wheel, so regardless of wanting to make something exceptional, they still have endless hoops to jump through.
Sidenote: from my own personal experience of the gaming industry (granted, it isn't a whole lot), it's very, very hostile, to the degree that you even get recruiters treating you like shit, even though you're a potential payout to them. Which just makes me think that the people who actually work in it either don't have a huge amount of self-respect or are too smitten with the idea of being a game developer.
They're smitten with being a developer, so supply is higher than demand and the employer gets to treat the employee like crap. The animation industry is the same way, from what I've heard secondhand.
I'm paraphrasing something I saw years, if not decades, ago. In the early days, hardware was so limited and primitive that software had to be incredibly well optimized just to run. Now, with how absurd the hardware is, software can be bloated to all hell and the PC's hardware can, for the most part, power through it. Of course, this laziness has cascaded to the current day, where software is getting too fat for even beefy hardware to support.
This is one reason I still have some respect for Nintendo. They actually polish their games for launch and keep the file sizes small. Of course, the reason for that is that they're developing for the Switch, but it's good practice either way.
Unless your name is Gamefreak, in which case you produce completely inexcusable pieces of trash and still sell millions.
The fact the world's largest franchise is still tied up with these morons is incredible.
I remember Doom 3 and Half-Life 2 were designed to play on just about any graphics card, including a 3dfx Voodoo. On something like an ATI 9700, that meant a fake image at the end of a hallway to make it look like a cool lighting effect. They really did think about stuff like that.
You don't know CTD pain until you've played a full campaign of Misery for STALKER: Call of Pripyat. A 32-bit executable with a 4 GB memory limit, and it leaks memory like a faucet. I'm sure it was just as painful, if not more so, in some of the large mods for Shadow of Chernobyl.
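For anyone wondering why that ceiling exists: a 32-bit executable can only ever address 2^32 bytes, i.e. 4 GiB, no matter how much RAM is installed, and in practice the game sees even less once the OS reserves its share. A tiny sketch of the arithmetic, purely illustrative and not taken from the game itself:

// Shows the hard address-space ceiling implied by pointer width.
#include <cstdio>
#include <cstdint>

int main() {
    std::printf("pointer size: %zu bits\n", sizeof(void*) * 8);
    if (sizeof(void*) == 4) {
        // A 32-bit build tops out at 2^32 bytes = 4 GiB of addressable space,
        // and games typically got only 2-3 GiB of that after the OS's share.
        std::uint64_t cap = 1ull << 32;
        std::printf("hard ceiling: %llu MiB\n", (unsigned long long)(cap >> 20));
    } else {
        std::printf("64-bit build: the ceiling is effectively RAM plus pagefile\n");
    }
    return 0;
}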
Or a modded-out XCOM 2 that is... not happy with the abuse you've caused it.
Granted, in these cases it's because mods are trying to squeeze things out of the game that it was never designed to handle, lol.
Sadly accurate. I really need to figure out what causes Avenger defense missions to just hang in my playthroughs, because not only does it obviously prevent me from playing, I actually really enjoy those missions, so it's not like I want to simply skip them to avoid the problem.
Given how many hours I dumped into a lagtastic modded setup, you'd think I'd remember more clearly what I suspected was going on with some of the screwy behavior.
Ah, wait, I think I'm starting to remember now. At least for what I'd experienced with my setup.
I think something in my total mod setup did not carry over cleanly into save files made during a mission, so while the game technically wouldn't crash outright most of the time or show any glaringly obvious issues, there'd still be a lot of errors going on in the background, leading to all kinds of strange delays and performance problems with rendering, AI decisions, animations, etc.
I certainly empathize though. I'd be so stoked for some of those bigger fights, but usually it's precisely those fights that led me to the previously described scenario because I had to save scum like crazy.
It's not like any of the Stalker games ran better without mods. You'd probably end up more stable with a mod, regardless of how hefty it was, simply because the modders had to fix how broken the game was in the process of making it.
Oh I agree. The problems were just compounded by a lot of bigger mods like Misery that pushed the extremely low memory limit to the brink because of all the additional textures, meshes, and object references.
Also there was a strange unfixable vanilla bug with the gun-toting zombies iirc that would inevitably corrupt any saves made within a certain radius (only if there was just one alive when you made that save).
The HBS BattleTech game is the same sort of way with the big mods. The game was never the best optimized, and some mods (RogueTech) add so much to the game that they can slow it way down.
These days I don't buy games that demand over $50. Those are usually the ones hogging all the available disk space because the devs are too lazy to optimise, and they usually come with current-year memes or microtransactions. When devs release a game under 20 GB, that's a good sign considerable effort has been put into it. Following that guideline has let me avoid all the hype and still get decent games that actually play better than Triple-A goyslop.
PC gaming is already reaching Warhammer levels of stupid, where the foundation is no longer understood by newer generations of programmers. The code is hacked together like Frankenstein's monster; it's a wonder anything works at all. Soon we'll reach a point where we don't build anything new and just rely on what we still have for as long as it can possibly last. Wait, did I say soon? It's already happening.
Improvisation is rarely to be welcomed, and then only when the Omnissiah’s will demands it.
But yep, it is all code copied from older code copied from even older code, with very little knowledge of why.
Total War is very much "do little work on the engine but add a lot of content to sell DLC". TW is what it is, but at least it's a good game.
It's funny that they worked with Intel on performance of Troy, and that actually worked. Troy runs 4K/60 for me in a way that few other games do.
Does this issue cause you any grief?
https://gamevro.com/vac-ban-causes-100k-loss-to-counter-strike-2-player/
I think they were in dev settings. And it clearly isn't a team process between the two companies. The thing that irritated me was: how could an update be designed to look like an "illegal mod"? Did the person that made the mod come up with the method? Was the mod the inspiration?
All good points. But the fact is that holding people's purchases hostage calls for a "fuck you, refund everything" response. Too many people put up with their behavior.
When it happened to the very first person there should have been more collaboration.
I don't mind. No one else I asked thought it was the issue I did.
I'm not a fan of digital libraries either. But the amount of money people spend on intangible items is crazy! It would only take one person getting everything charged back for them to stop. My dad has multiple accounts more than 20 years old on multiple platforms. I think of his gaming library every time these things happen.
The link especially had me angry for people. An update shouldn't cause so much drama.
I've been pondering an objective structure for defining a "good" game versus a "bad" game, with varying amounts of luck. "Good" being a game with systems that magnify and/or differentiate gameplay more than they get in the way of it, with a "bad" game doing the opposite. Attempting to extract personal taste from game reviewing is proving as difficult as it is useful.
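One rough way to make that structure concrete, purely as a sketch (the system names and numbers below are invented, not real scores): rate each system by how much it differentiates play minus how much it gets in the way, then sum.

// Hypothetical rubric: differentiation minus friction, summed over systems.
#include <cstdio>
#include <vector>

struct System {
    const char* name;
    float differentiation;  // how much the system magnifies/varies gameplay
    float friction;         // how much it gets in the way
};

float scoreGame(const std::vector<System>& systems) {
    float total = 0.0f;
    for (const auto& s : systems)
        total += s.differentiation - s.friction;
    return total;  // positive leans "good", negative leans "bad"
}

int main() {
    // Toy numbers for an imaginary shooter, illustration only.
    std::vector<System> shooter = {
        {"gunplay",   7.0f, 2.0f},
        {"loot",      4.0f, 5.0f},
        {"traversal", 3.0f, 1.0f},
    };
    std::printf("total score: %.1f\n", scoreGame(shooter));
    return 0;
}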
Functionality is definitely a big part of it. Starfield, for example, would get very few positive points for its shooting mechanics, because they're so basic (and buggy), and there's a lot of shit that gets in the way without adding much variability. Games like Roboquest would get much higher marks, because the shooting is better made and has systems in place to greatly differentiate the player's experience within it.
Starfield: the models are detailed, but whether it can hold a decent framerate depends on what you're looking at. They don't appear to have much of a mechanism for simplifying far-away objects, at least as far as I can tell. So if you actually try to look at the panoramas the game generously sets up for you, it goes all slow.
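For reference, the usual answer to that is distance-based LOD: far objects swap to cheaper meshes or flat impostors instead of rendering at full detail. A minimal sketch of the idea follows; the thresholds and names are invented, and I can't confirm how Starfield handles this internally.

// Hypothetical distance-based LOD selection.
#include <cstdio>

enum class Lod { Full, Medium, Low, Impostor };

Lod pickLod(float distance) {
    if (distance < 50.0f)  return Lod::Full;     // full-detail mesh up close
    if (distance < 200.0f) return Lod::Medium;   // reduced polygon count
    if (distance < 800.0f) return Lod::Low;      // very coarse mesh
    return Lod::Impostor;                        // flat billboard for distant panoramas
}

int main() {
    const float distances[] = {10.0f, 120.0f, 500.0f, 3000.0f};
    for (float d : distances)
        std::printf("distance %6.0f -> LOD level %d\n", d, static_cast<int>(pickLod(d)));
    return 0;
}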
On my 3080 laptop, there are games I can run at 4K/60 fps. I wouldn't say it's most of them, but it can be done. Starfield I'm running at 60% resolution, less than 2K, so it's already getting a break.
And for laptops, not being able to do real 4K has been the situation for years. I remember trying to play Total War on a 1080 SLI setup, struggling, and switching to 2K. Every GPU since has been advertised as 4K-capable. And they're not, really.
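For reference, the back-of-envelope arithmetic behind that 60% render scale, assuming a 3840x2160 output target (the numbers are only illustrative):

// What a 60% render scale means in actual pixels at a 4K output target.
#include <cstdio>

int main() {
    const int outW = 3840, outH = 2160;  // 4K output
    const float scale = 0.60f;           // internal render scale
    const int rw = static_cast<int>(outW * scale);
    const int rh = static_cast<int>(outH * scale);
    // 2304x1296: below 2560x1440, i.e. less than "2K", and only 36% of 4K's pixels.
    std::printf("internal resolution: %dx%d (%.0f%% of 4K's pixels)\n",
                rw, rh, 100.0f * (float(rw) * rh) / (float(outW) * outH));
    return 0;
}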
Needs a day-one patch, I'm sorry mate.
Oh, thank you for your 70-dollar purchase!