This is an odd thought process, but I think a lot of people are stuck in a specific year with their thinking. Take remote working, or as it was called in the '90s, being a "technomad". That's a 34-year-old idea that came about before web browsers became a big thing.
In the medical world it's common to see the latest equipment hooked up to a computer running Windows XP or earlier. I've taken a piece of equipment to a specialist repair shop because the video card it needed was made by ATI. The equipment isn't anything new, and the manufacturer demands that the older computer be used. The modern hospital is running about 20 years behind, if you ever think about it.
Someone here talked about cyberpunk being outdated because our tech is even more powerful and does even cooler things. People talk about it being just around the corner and don't realize we've already passed it.
So, why are so many people and fields caught in years gone by? What is hampering them? How does this affect politics? What can be done to speed things up?
I'm pretty sure there's still more COBOL running in production than just about any other language.
I've seriously considered learning COBOL just to have another point on my resume to fall back on, considering all the legacy systems still running on it.
If electricity starts to become expensive again (which isn't hard to imagine if current trends continue), energy efficiency in large-scale computing is going to become important again, as will doing things in low-level languages.
Beyond that, once the physical hardware these software systems run on starts to fail, and given how timing-sensitive those old systems were, I wouldn't be surprised if you start to see people reimplementing those old system architectures on FPGAs.
How much critical infrastructure in the world depends on some 486 in a basement that no one can replace because the timing of modern systems is just different enough that the software doesn't work? What happens when the hardware finally dies? What happens when the project to replace it with a modern system fails?
I actually kinda like assembler because it's so rudimentary that it makes the basic idea of how electronics work a lot clearer. C, strangely, is so fundamental that it's just kinda open for you to do whatever you want.
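To give a sense of what I mean by "open", here's a minimal C sketch (the struct and values are made up purely for illustration, not from any real system) that treats a value in memory as raw bytes, and the compiler doesn't push back at all:

    /* Minimal sketch of the "do whatever you want" feel of C: nothing here
       is from a real device, it's just an illustrative struct. */
    #include <stdio.h>
    #include <string.h>
    #include <stdint.h>

    struct reading {
        uint16_t sensor_id;
        uint16_t value;
    };

    int main(void) {
        struct reading r = { .sensor_id = 7, .value = 512 };

        /* Reinterpret the struct as raw bytes; C lets you look straight
           at the memory underneath your data. */
        unsigned char bytes[sizeof r];
        memcpy(bytes, &r, sizeof r);

        for (size_t i = 0; i < sizeof bytes; i++) {
            printf("byte %zu: 0x%02x\n", i, bytes[i]);
        }
        return 0;
    }

The exact output depends on your machine's endianness, which is exactly the kind of hardware detail C exposes instead of hiding, and that's why it still feels closer to assembler than to most modern languages.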