Games as well. Tech has some real cool shit these days, but with 99% of the people trying to use it being drooling retards, it looks like complete ass even compared to some things that came out 10+ years ago. Back then they had actual skilled developers who knew how to properly model & tweak things to get them to look as good as they could and not run like hot garbage, instead of relying on scaling crap to make up for it (eat shit Remnant 2).
It doesn't have anything to do with generational decline and everything to do with how the industry works. At the start of game development, in the '80s-'90s, there was no internet and no Silicon Valley boom, so programming jobs were rather limited in scope. There wasn't as much headhunting, and salaries were not diverging as drastically.
At this moment it's the mirror opposite - there's always a need for more programmers, and, when it comes to pay, traditional gamedev is the least competitive of them all. Even mobile gamedev salaries can be 2x-3x what a PC gamedev coder earns. So it's the dregs that are developing modern AAA titles - proficient specialists are highly unlikely to find themselves there. Unless they're fanatics, but that burns out fast, and they soon leave the industry for greener pastures.
This comment is in denial. When AAA hires, they have top-down mandates to diversify. If you think the talent pool for video game dev is bad because of low salaries, what happens when you further dilute that pool with gender and race requirements? You end up being forced to hire barely functional (but highly diverse) retards. How many such devs do you need to throw at a AAA production? Thousands, apparently. And the end results are Diablo 4 and Starfield.
In some respects the scope of the objective when creating a new engine was a bit smaller back then, which made one a little easier to program from scratch.
Although at the same time, a lot of programming tools back then were also more prone to bugging out and breaking, especially compilers.
Still, I generally agree that game developers of yesteryear were exceedingly competent by comparison, sometimes producing extremely solid games within 1-2 years with teams no larger than 20. And while yes, some might say "games are more complex than they used to be", the actual workload required for said features in a lot of games does not actually line up with the amount of bloat companies end up hiring.
Hells, game developers used to have to create most animations by hand, without motion capture even being on the table.
Pointers make more sense if you're coming from machine code/assembly. I recommend Code by Charles Petzold for a general overview of computing, and a few days of any form of assembly to really get C pointers.
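For anyone who wants the short version: a pointer is just an address, the same thing you'd stick in a register before a load or store. A minimal toy sketch in C (nothing project-specific, just the idea):

    #include <stdio.h>

    int main(void)
    {
        int x = 42;
        int *p = &x;   /* &x takes the address of x; p now holds that address */
        *p = 13;       /* dereference: "store 13 at the address in p" - x is now 13 */
        printf("x = %d, stored at %p\n", x, (void *)p);
        return 0;
    }

Once you've seen assembly do the same thing with an address in a register, the * and & syntax stops being magic.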
There's no reason to write an entire game engine from scratch besides wanting to do it, which is perfectly fine. You're just not going to write one and then make and publish a game with it unless you're a one-in-a-million type of designer.
I'll bet most developers these days couldn't write "hello world" in C.
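For reference, here's the whole thing in standard C, in case anyone wants to prove me wrong:

    #include <stdio.h>

    int main(void)
    {
        printf("hello world\n");   /* that's the entire program */
        return 0;
    }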