I guess I'm an old fart because I just assumed all this security-type stuff was mostly done at the software level. But my machine language knowledge is much more at the Z80/68k level, or old 386. Although I'd probably still prefer the CPU just execute what it's told without trying to check on the code, because otherwise it's basically just a TPM anyway.
I'm sure it's just the general incompetence that's prevalent in the computer industry. I've never researched it, but Apple would be stupid to try to redesign an ARM chip from the ground up. It's probably more of an arrangement like the Xenon PPC chip in the Xbox 360, where Microsoft wanted to own the design so they could alter/scale it for cost in the future.
Apple designed the M1 chip with over $1 billion of alleged R&D over 11 years, but the main feature is speed, and the second feature is a set of special hardware instructions that let it emulate an Intel x86 about 3 times faster than pure software emulation, mainly endian byte-order swaps and such.
Apple wrote fancy LIBRARIES to help emulation engineers port Linux and video games:
https://developer.apple.com/documentation/apple-silicon/about-the-rosetta-translation-environment
https://news.ycombinator.com/item?id=23613995
The M1 in some configurations, for around 10,000 dollars, is of course the fastest computer you can buy for scientific computation. [Apple Mac Studio with M1 Ultra]
https://appleinsider.com/inside/mac-studio/vs/compared-mac-studio-with-m1-max-versus-mac-studio-with-m1-ultra
M1 Ultra benchmarks versus Intel 12900K, AMD 5950x and Xeon W: https://www.ithinkdiff.com/m1-ultra-benchmarks-intel-12900k-amd-5950x-xeon/
New CPU benchmarks show that Apple's powerful M1 Ultra chip outperforms Intel's 12th-generation Alder Lake Core i9 12900K chip in multi-core performance, and AMD's Ryzen 5950X in both multi-core and single-core performance.
The paper states that the Pointer Authentication module was designed by ARM and released in 2017 as part of the ARMv8 instruction set. Usually what happens with ARM chips is that a manufacturer (Apple in this case, though there are a bunch of others) will license a particular ARM design and package it with various other peripherals (e.g. display controllers, USB, SATA, etc.) and memory to produce a single System on Chip (the M1 chip in this case). The manufacturer owns the SoC design, but part of that design is the CPU portion they licensed from ARM.
It's possible that ARM developed this sort of functionality at the behest of Apple (I've heard rumors that Intel has developed certain x86 functions at the behest of Amazon), but this looks like it's an ARM flaw rather than an Apple one.
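If you want a feel for what Pointer Authentication actually does, here's a minimal sketch in C with inline assembly, assuming an ARMv8.3-A CPU with PAC enabled (the pac_sign/pac_auth names are mine for illustration, not an Apple or ARM API):

```c
#include <stdint.h>

// Sign a pointer: PACIA tucks a cryptographic MAC (the PAC) into the unused
// high bits of the pointer, keyed by a per-process key plus a context value.
static inline void *pac_sign(void *ptr, uint64_t context) {
    asm volatile("pacia %0, %1" : "+r"(ptr) : "r"(context));
    return ptr;
}

// Authenticate a pointer: AUTIA checks the PAC and strips it. If the check
// fails, the pointer is deliberately corrupted so any later use faults.
static inline void *pac_auth(void *ptr, uint64_t context) {
    asm volatile("autia %0, %1" : "+r"(ptr) : "r"(context));
    return ptr;
}
```

The PACMAN attack is about leaking that MAC through speculation, not breaking the crypto behind it.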
What's the fucking point? It's like having a racecar which is faster than any other, but only if it's on ice.
Is there a lot of software that is only available for 32-bit ARM? Seems more like a racecar that can't drive on ice, which is to say almost every racecar.
The issue would be old software compiled for 32-bit x86 on Mac. They haven't supported such software for a while, though. 32-bit ARM code is a thing, but I don't think anyone ever used it on a Mac, so there's not a huge case for trying to run it. You probably can emulate it, though.
But it's ARM anyway. It can't run x86 of any sort.
Not natively. We were talking about their emulation layer. I don't think their emulation layer works for 32-bit x86 code or 32-bit ARM code. At least that's what it looked like when I googled it.
Countless legacy software, games, etc?
Since you seem to know a lot more about this stuff than I do: does this exploit affect the new M2 chip they just announced? And of course I've seen a lot of what seemed to be circle-jerking on HN about how fast the chip was, but is that in general-purpose usage or in specific scientific computation scenarios? Not a Mac guy at all and was just curious.
Aren't x86 and ARM both little endian? ARM used to support big endian and even mixed endian; however, I think they dropped all big-endian support.
Still, I would think those instructions are generally useful for dealing with peripherals.
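For what it's worth, this is the kind of byte-order swap in question, e.g. when reading a big-endian value from a network packet or a peripheral register on a little-endian CPU. A minimal sketch; __builtin_bswap32 is a GCC/Clang builtin that compilers turn into a single REV on ARM or BSWAP on x86:

```c
#include <stdint.h>
#include <stdio.h>

int main(void) {
    uint32_t wire = 0x11223344;               // big-endian value off the wire
    uint32_t host = __builtin_bswap32(wire);  // one instruction on ARM or x86
    printf("0x%08x -> 0x%08x\n", wire, host);
    return 0;
}
```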
Also old fart. I kind of wish I had kept up with the hardware side of things, because when I read stuff like this, my mind is blown:
For decades now, computers have been speeding up processing using what’s called speculative execution. In a typical program, which instruction should follow the next often depends on the outcome of the previous instruction (think if/then). Rather than wait around for the answer, modern CPUs will speculate—make an educated guess—about what comes next and start executing instructions along those lines. If the CPU guessed right, this speculative execution has saved a bunch of clock cycles. If it turns out to have guessed wrong, all the work is thrown out, and the processor begins along the correct sequence of instructions. Importantly, the mistakenly computed values are never visible to the software.
The little fuckers basically time travel these days.
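The classic way to see this for yourself: the loop below runs several times faster when the data is sorted, purely because the branch becomes predictable and the CPU speculates down the right path almost every time. A toy sketch, nothing M1-specific:

```c
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N (1 << 20)

static int cmp(const void *a, const void *b) {
    return *(const int *)a - *(const int *)b;
}

int main(void) {
    static int data[N];
    for (int i = 0; i < N; i++) data[i] = rand() % 256;
    qsort(data, N, sizeof(int), cmp);   // comment this out and compare timings

    long long sum = 0;
    clock_t t0 = clock();
    for (int pass = 0; pass < 100; pass++)
        for (int i = 0; i < N; i++)
            if (data[i] >= 128)         // the CPU guesses this branch in advance
                sum += data[i];
    printf("sum=%lld in %.2fs\n", sum, (double)(clock() - t0) / CLOCKS_PER_SEC);
    return 0;
}
```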
That's really interesting. I need to read up on modern CPU architecture sometime just for my own interest.
Speculative execution has been in use for at least a couple of decades at this point. The AI being used to generate the precompiled logic today is scary. I can foresee a time when someone turns on a brand-new computer, is denied access, and is immediately arrested for future criminal activity based on what the webcam picks up in the surroundings.
No, because you really don't want the software to have that kind of access. You could prohibit speculative execution outright, and eat the resulting cost in speed as you have to perform the calculations in strict start-to-finish order, but you can't trust the software to perform these calculations ahead of time and pinky promise to discard that work properly if it's not needed.
Oh hell no.
Most of the state-of-the-art attacks nowadays either hinge on using speculative execution to get the processor to cough up something it shouldn't, or on hammering DRAM rows with repeated accesses to induce bit flips in neighboring rows through electrical disturbance (Rowhammer).
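The speculative-execution half of that usually looks something like the Spectre v1 pattern. A heavily simplified sketch, with the array names borrowed from the original paper:

```c
#include <stdint.h>
#include <stddef.h>

uint8_t array1[16];
uint8_t array2[256 * 4096];
size_t array1_size = 16;
uint8_t temp;   // keeps the read from being optimized away

void victim(size_t x) {
    if (x < array1_size) {                  // predictor can guess "in bounds"
        temp &= array2[array1[x] * 4096];   // speculative OOB read + cache fill
    }
    // The bad result is thrown away architecturally, but which cache line of
    // array2 got warmed still depends on the secret byte, and timing it leaks.
}
```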
It draws an interesting crowd. It was originally one of the most anti-credentialist, libertarian technical professions out there. Now its members seem to have veered into either the far left or the far right. You don't really see very many (excluding perhaps those on H-1B visas) who just want to figuratively grill.
I suspect (though I don't have any hard evidence of this) that we are significantly over-represented within the dissident right.
My experience has been that people reflect geography more than their profession. If you're in Silicon Valley, you're bound to encounter a ton of leftists. If you talk to computer engineers in Oklahoma, not so much. Of course the average is weighted towards coastal elites, so there's probably more of them tbh. But there's also a lot of overlap with gamers who tend to be reactionary.
Libertarianism isn't really either thing. You have left-libertarians and you have those that are conservatives. The main thing that's happened, which Elon Musk alluded to recently, is that the woke, mainstream left, which isn't really even a hard left, has become anti-libertarian. As in, they literally hate liberty.
Yes. The crypto-based. We live in the shadows. Literally, literally.
Yes. The Daystar is evil, and must be avoided!
Lol is that a BG2 reference? Or did you literally mean the sun?
This is only bypassing a protection that no other consumer processors have anyway.
Basically each pointer has a magic number attached in order to work, so a blind exploit only has about a 1-in-16k chance of working, and this protection is what's bypassed. You still need an exploit and access to use it, just like on any other computer.
So the M1 is just as safe as an Intel/AMD chip, not extra safe.
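Back-of-the-envelope on that "magic number" (the PAC): assuming b usable PAC bits (the exact width depends on the virtual address configuration, so the 14-16 range here is an assumption), a single blind forgery succeeds with probability 2^-b:

```c
#include <stdio.h>
#include <math.h>

int main(void) {
    for (int b = 14; b <= 16; b++) {
        double p = pow(2, -b);   // chance one random guess passes the check
        printf("%d PAC bits: 1 in %.0f per guess, ~%.0f tries for 50%% odds\n",
               b, 1 / p, log(0.5) / log(1 - p));
    }
    return 0;
}
```

The catch is that on a normal system each wrong guess crashes the process; PACMAN's trick is testing guesses speculatively so nothing ever crashes.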
edit: if you want an example of a real boneheaded mistake, Google created SPDY, aka HTTP/2, and put it in their browser with it compressing private and public header data in the same compression context, which meant an attacker could recover your cookies for other sites by watching compressed sizes (the CRIME attack). Afaik the ones behind it were not diversity hires, but they certainly were white-knight fedora-wearing idiots; they were so eager to destroy privacy with SPDY/HTTP 2 that they accidentally destroyed it too much and had egg on their face, looking like freshman comp sci.
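The failure mode is easy to demo with zlib: mix a secret and an attacker guess in one compression context, and the output size leaks whether the guess matched. A toy sketch (real attacks use longer matches and many samples; single-byte differences won't always show at this tiny scale):

```c
#include <stdio.h>
#include <string.h>
#include <zlib.h>

// Compressed size of a string; smaller output means DEFLATE found more repeats.
static size_t packed_size(const char *s) {
    unsigned char out[512];
    uLongf outlen = sizeof(out);
    compress(out, &outlen, (const unsigned char *)s, strlen(s));
    return outlen;
}

int main(void) {
    const char *secret = "Cookie: session=S3KR1T";   // hypothetical header
    char probe[256];
    for (char c = 'A'; c <= 'Z'; c++) {
        // Attacker-controlled guess compressed alongside the secret header.
        snprintf(probe, sizeof(probe), "%s\nCookie: session=%c", secret, c);
        printf("guess %c -> %zu bytes\n", c, packed_size(probe));
    }
    return 0;   // the right first letter tends to compress smallest
}
```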
So it's "when", not "if".
But these chips identify as highly secure, so don’t worry!
Gotta have a back door.
It's probably by design because the FBI/CIA/NSA wants every PC chip remote controllable.
Literally my first thought was 'Oh, so they found the backdoor designed for the CIA.'
M1 was designed by an Israeli
https://www.startpage.com/sp/search?query=intel+flaws+in+chips+decades+old&cat=web&pl=opensearch&language=english
Intel has a few that go all the way back to the Pentium 1. Last I heard, it would be years before they fixed the two that had been found, but they were not being revealed because there was no patchable fix at the time.
I’m just looking forward to the inevitable free M2 upgrade that Apple is going to hand out when they recall my M1 machine.
Oh hello lol
No, the blame here is due to the desire for speed. Speculative execution is a huge performance boost for modern processors, but the problem is that it opens you up to side-channel attacks. Nobody who does speculative execution is immune to this, because the technique by definition leaves traces of what the processor is doing in shared microarchitectural state (caches, branch predictors) that other code can measure.
Intel and AMD had the same problem, and most likely will continue to have the same problem. It can be fixed in software, but the fixes will lower your performance to a greater or lesser degree, and new exploits will likely continue to be found. To repeat, we are in this mess because people wanted to make processors faster, and there is no easy fix besides making them slower, which nobody wants to do for obvious reasons.
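What "fixed in software" looks like in practice, and why it costs performance: a speculation barrier after every risky bounds check. A minimal sketch, assuming x86-64 here (on ARM the equivalent is CSDB or SB):

```c
#include <stdint.h>
#include <stddef.h>

extern uint8_t array1[];
extern size_t array1_size;

uint8_t safe_read(size_t x) {
    if (x >= array1_size)
        return 0;
    // LFENCE stops younger loads from executing until the bounds check has
    // actually retired, killing the speculative window and stalling the pipe.
    __asm__ volatile("lfence" ::: "memory");
    return array1[x];
}
```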