This is an odd thought process, but I think a lot of people are stuck in a specific year with their thoughts. Take remote working, or, as its practitioners were called in the 90s, technomads. That's a 34-year-old idea, one that predates web browsers becoming a big thing.
In the medical world it's common to see the latest equipment hooked up to a computer running Windows XP or earlier. I've taken a piece of equipment to a specialist repair shop because the video card it needed was made by ATI. The equipment isn't anything new, and the manufacturer demands that the older computer be used. If you think about it, the "modern" hospital is running on technology that's about 20 years old.
Someone here talked about Cyberpunk being outdated because our tech is even more powerful and doing even cooler things. People talk about it being right around the corner and don't realize we've already passed it.
So, why are so many people and fields caught in years gone by? What is hampering them? How does this affect politics? What can be done to speed things up?
That's because the FDA approval process takes millions of dollars and years of work. So once a company has gone through that process, they would prefer not to do it again unless there's a good reason to. Upgrading to the latest version of a consumer-grade OS, when it doesn't actually improve the product, is often not a sufficient reason by itself.
That aside, if modern web browsers still worked on XP, I'd probably still be using it. I consider XP-64 about as close to the perfect consumer OS as we've ever gotten from MS.
The manufacturer isn't allowed to change a component without going through some sort of change control process, and if the change is significant enough they may have to go through the entire approval process again. A graphics card swap might not trigger that, but then again, if it's an imaging device, the FDA might deem it a significant enough change to require it.
Because when something works, there often is no need to change it. And newer is not always better. CS programs used to teach us that "the nice thing about software is it never wears out." Of course, then the industry figured out a way to make it wear out: a continuous stream of incremental automatic updates which force people to keep everything up to date so as not to break anything.
I prefer the old way: at least the default of "do nothing" still left you with a functioning system, instead of one that stops working over time as newer and subtly incompatible software releases are silently pushed onto your system and gradually break more and more things.