Also old fart. I kind of wish I'd kept up with the hardware side of things, because when I read stuff like this, my mind is blown:
For decades now, computers have been speeding up processing using what's called speculative execution. In a typical program, which instruction comes next often depends on the outcome of the previous instruction (think if/then). Rather than wait around for the answer, modern CPUs will speculate—make an educated guess—about what comes next and start executing instructions along those lines. If the CPU guessed right, this speculative execution has saved a bunch of clock cycles. If it turns out to have guessed wrong, all the work is thrown out, and the processor begins along the correct sequence of instructions. Importantly, the mistakenly computed values are never visible to the software.
The little fuckers basically time travel these days.
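If you want to poke at the idea in software, here's a toy sketch of why guessing pays off: a simplified 2-bit saturating-counter branch predictor (a real hardware technique, heavily simplified). The cycle costs and function names are made-up illustrations, not real CPU numbers:

```python
# Toy model of speculative execution around branches: a 2-bit
# saturating-counter predictor. Correct guesses cost 1 cycle; a wrong
# guess flushes the speculated work at an assumed penalty. All numbers
# here are illustrative, not taken from any real CPU.

FLUSH_PENALTY = 15  # assumed cycles lost when a misprediction is flushed


def run(branch_outcomes, penalty=FLUSH_PENALTY):
    """Count cycles for a stream of taken (True) / not-taken (False) branches."""
    counter = 2  # 2-bit state: 0-1 predict not-taken, 2-3 predict taken
    cycles = mispredicts = 0
    for taken in branch_outcomes:
        predicted_taken = counter >= 2
        if predicted_taken == taken:
            cycles += 1  # speculation paid off: no stall
        else:
            cycles += 1 + penalty  # wrong guess: throw out the work
            mispredicts += 1
        # Train the predictor toward the actual outcome.
        counter = min(counter + 1, 3) if taken else max(counter - 1, 0)
    return cycles, mispredicts


# A highly regular branch (e.g. a loop condition that's almost always
# taken) is predicted nearly perfectly, while a branch that flips every
# iteration fools this predictor about half the time.
print(run([True] * 100))         # regular: no mispredictions
print(run([True, False] * 50))   # alternating: misprediction-heavy
```

The same intuition is why real-world code sometimes runs measurably faster on sorted data than unsorted: the branch pattern becomes predictable.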
Speculative computation has been in use for at least a couple of decades at this point. The AI being used to generate the precompiled logic today is scary. I can foresee a time when someone turns on a brand new computer, is denied access, and is immediately arrested for future criminal activity based on what the webcam picks up in the surroundings.
That's really interesting. I need to read up on modern CPU architecture sometime just for my own interest.