This is an odd thought process, but I think a lot of people are stuck in a specific year with their thoughts. Take remote working, or "technomadism" as it was called in the '90s. That's a 34-year-old idea, conceived before web browsers became a big thing.
In the medical world it's common to see the latest equipment hooked up to a computer running Windows XP or earlier. I've taken a piece of equipment to a specialist repair shop because the video card it needed was made by ATI. The equipment isn't anything new, and the manufacturer demands that the older computer be used. The modern hospital is about 20 years old, if you ever think about it.
Someone here talked about cyberpunk being outdated because our tech is even more powerful and doing even cooler things. People talk about it being around the corner and don't realize we already passed it.
So, why are so many people and fields caught in years gone by? What is hampering them? How does this affect politics? What can be done to speed things up?
A friend programmed a track switch for a ski resort, and the controller had 5 KB of capacity. We found out most companies used these same devices. Comments had to be stripped out by the compiler to make room. It was 2005.
I realized that if they upgraded, the switches would be more powerful than anything the company understood. They could be hacked easily without anyone noticing. The entire manufacturing industry was based on tech from the 1980s that had never been improved.
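For a sense of scale, here's a rough sketch of what programming against a budget like 5 KB tends to look like. The track-switch names and logic here are made up for illustration, but the style is typical of tiny controllers: everything static, no heap, every byte counted.

```c
#include <stdint.h>

/* Hypothetical track-switch controller in the spirit of a 5 KB part.
   All state is static: no malloc, nothing beyond stdint. */

#define POS_LEFT  0u
#define POS_RIGHT 1u

static uint8_t  current_pos = POS_LEFT;
static uint16_t move_count  = 0;  /* wraps around; 2 bytes is all we spare */

/* Request a switch move; returns 1 if the position actually changed. */
uint8_t request_move(uint8_t target)
{
    if (target > POS_RIGHT || target == current_pos)
        return 0;
    current_pos = target;
    move_count++;
    return 1;
}

uint8_t  position(void) { return current_pos; }
uint16_t moves(void)    { return move_count; }
```

With no room for comments in the image, code like this ends up documented nowhere but someone's head, which is exactly how the lock-in happens.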
I'm pretty sure COBOL is still one of the most widely deployed programming languages today.
I've seriously considered learning COBOL just to have another point on my resume to fall back on, considering all the legacy systems still running it.
If electricity starts to become expensive again (which isn't hard to imagine if current trends continue), energy efficiency in large-scale computing is going to become important again; as will doing things in low-level languages.
Beyond that, once the physical hardware these software systems run on starts to fail, given how timing-sensitive those old systems were, I wouldn't be surprised if you start to see people implementing those old system architectures on FPGAs.
How much critical infrastructure in the world depends on some 486 in the basement that no one can replace because the timing of the modern systems is just different enough it doesn't work? What happens when the hardware finally dies? What happens when the project to replace that system with a modern one fails?
I actually kinda like assembler because it's so rudimentary that it makes the basic idea of how electronics work a lot clearer. C, strangely, is so fundamental that it's just kinda open for you to do whatever you want.
The vistas of imagination, if not well-nourished, can become a captivity of the mind. We live in a strange time where the mental landscapes have collided and no higher narrative is able to take root.
What can speed things up? A new story that extends beyond cyberpunk and the like. How are politics affected? 2020 is an existential collision point for several generations who a century ago would otherwise have died of cholera or at the receiving end of a musket barrage. We are in a meta-imaginative primordial soup where no form can take shape to mold the generations to come. This is a clear stalemate within the collective consciousness, and politics will continue to be a recursive, self-serving shitshow. Everything technological is rapidly evolving on autopilot with no context or direction, as if it has a mind of its own, approaching a singularity of its own.
Why are so many people and fields caught in years gone by? It all happened faster than certain fields could evolve, and people aren't dying fast enough. These fantasies shape what kind of person is created and what kind of future is manifest - storytelling is not irrelevant or mere entertainment like we treat it; it has a definitive manifest destiny (and a direct connection to our biology), and we are stagnant, too preoccupied with our own timelines and narratives to really take notice. It is an inefficacy of our very design - we cannot outpace the serpent of materialism itself as it winds around us with its infinite coils of strata, offering yet another fruit which will plunge us deeper into the hypermaterial.
There is a creative spiritual evolution that needs to take place to really make use of these "cooler" things, because one day we're going to fuck up and create, or rather summon, Roko's Basilisk with the exact same stunted consciousness we have now.
Construction moves forward one building at a time. This is why Revit and Rhino look so archaic.
Joe Biden has left the chat.
Yeah, a lot of politicians are stuck in 2002.
Though Biden may be stuck in '84.
Joe Biden has entered the Racial Jungle of his youth
People are stuck in the time they either peaked in or that was most traumatic for them. It's comfortable, and more often than not they refuse to change beyond that. C'est la vie.
The hows and whys of tech/companies have already been covered.
Most doctors are stuck in the thinking and knowledge from medical school.
Is that a dig at docs or calling my statement stupid?
Doctors. I grew up in the medical industry and saw it all the time.
Sadly it doesn't help as much as it should, especially if it's a rare disease.
That's because the FDA approval process takes millions of dollars and years of work. So once a company goes through that process, they'd prefer not to do it again unless there's a good reason. Upgrading a consumer-grade OS to the latest version, when it doesn't actually improve the product, is often an insufficient reason by itself.
Though that aside, if modern web browsers still worked on XP I'd probably still use it. I consider XP-64 about as close to the perfect consumer OS as we've ever gotten from MS.
The manufacturer isn't allowed to change the component without going through some sort of change-control process, and if the change is significant enough they may have to go through the entire approval process again. A graphics card might not trigger that, but then again, if it's an imaging device, maybe the FDA would deem it a significant enough change to demand it.
Because when something works, there often is no need to change it. And newer is not always better. CS programs used to teach us "the nice thing about software is it never wears out". Of course, the industry then figured out a way to make it wear out, through a continuous stream of incremental automatic updates that force people to keep everything up to date so as not to break anything.
I prefer the old way: at least the default of "do nothing" still left you with a functioning system, instead of one that stops working over time as newer and subtly incompatible software releases are silently pushed onto your system and gradually break more and more things.
Science (and by extension, technology) advances one funeral at a time.
Well, the good news is that military science causes funerals, so that speeds up the process.
I seem to recall a news story a few years ago about a local school upgrading their HVAC system. Somehow, it was controlled with an Amiga or something like that.
Still is.
It’s a risk of unnecessary slowdown for some places. Imagine one of the few remaining US-based factories has an assembly line that’s been running since the '70s. They updated the line with a modern computer for what’s essentially automated data entry into a spreadsheet covering production numbers, downtime for machine breakdowns, a forklift being late with supplies for the line, etc. - a computer running XP when it came out. It took a while for the line to get back to producing the expected output; after all, some machines are from the '70s, '80s, '90s, '00s, and '10s. Why would you screw up your production to update the OS or computer hardware for spreadsheet entry? Unless it becomes more difficult or more expensive to replace parts on those computers, throwing modern tech in with decades-old equipment will have a lot of growing pains, cutting into your expected output, which then drives up the cost of your product. Shareholders start to question why the factory is still in the US instead of the third world… all because some dumb IT guy thought Windows 11 and the latest version of Excel would somehow enter numbers into a spreadsheet better.
Sounds about right. To nuance it a bit: while cost is the primary reason hardware in military equipment lags behind the civilian space, some of it is also due to design constraints imposed by the military for the intended application.
For instance, Thales (the defense Thales) was commissioned to assess the cost of upgrading the network system of a ship (from God knows what to a 10 Gigabit system, with future-proofing for a 10+ Gigabit system in the near future; can't recall the exact specifics).
While the obvious solution is to rip out the old network infrastructure and replace it with a newer, more modern system, unfortunately it was not that easy, since this system and its (physical) connectors not only had to carry your regular ol' network connectivity, but also supply power and withstand certain types of forces and military standards (salt-water intrusion and EMP attacks, to name a few). In short, the solution for the connectors was some type of exotic (read: expensive) hybrid connector (it looked like a Phoenix Contact M23 hybrid connector) that met and exceeded those specs and could carry a 10 Gigabit Ethernet connection without introducing further signal losses. That connector alone would cost a few thousand Euroshillings per unit to put into an LCF frigate.
Honestly, if it weren't for a new Thales radar system, the government wouldn't have upgraded the ship's network connectivity, since the old system was sufficient for its intended usage.
Ehh, idk about that. There are some pretty sophisticated radar, guidance, and optical systems that are way beyond stuff you'll find on the civilian market.
And by '90s, you mean no modern PC was using it by 1992.
It's one of those really weird things. The military only advances if they see a war. 3D gaming owes a ton to military simulation companies: Pilotwings 64 was created by one, and that same company designed the chip the N64 had. A lot of the interconnected phone stuff we have is from blue-wave tech in the sandbox. Heck, I think Sony has military contracts to design chips and uses us to test them. They claim higher sales to cover up the contracts.
Then you see the floppy disk on the submarine and wonder what happened.
Not even Fast Ethernet? My god!
Yup. That's the military.
And they mention levels, while someone runs around GTA.