
Apple Silicon vulnerability leaks encryption keys, and can't be patched easily

Apple Silicon M2 in front of a MacBook

A new vulnerability in Apple Silicon chips can allow a determined attacker to access a user's data by stealing the cryptographic keys — and a fix could considerably impact encryption performance.

Researchers have discovered an issue with Apple's M-series chips in how they handle cryptographic operations, such as the encryption of files. Because it stems from the chips' architectural design, it is very difficult to mitigate.

Detailed on Thursday by a group of researchers and reported by Ars Technica, the problem lies in the data memory-dependent prefetcher (DMP), which predicts the memory addresses of data that currently running code is most likely to access next. Because that data is prefetched into the cache, it becomes a target for probing by malicious code.

This is because prefetchers use previous access patterns to predict the next piece of data to fetch. An attacker can exploit this behavior to influence which data is prefetched, opening the door to accessing sensitive data.

GoFetch attack can steal encryption keys

The attack, which the researchers call "GoFetch," takes advantage of a quirk in the DMP's behavior on Apple Silicon: the DMP can confuse the contents of memory with the pointer values used to load more data, occasionally treating the former as the latter.

In explaining the attack, the researchers confirm that it is possible to make data "look like" a pointer, which the DMP will treat as an address and pull into the cache. The presence of that address in the cache is observable, meaning malicious code can detect it.

The attack uses chosen inputs to manipulate intermediate data within the encryption algorithm so that it looks like a pointer. The DMP, seeing the data value as an address, then fetches the data from that address, and the address itself is leaked through the cache.
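The mechanism can be illustrated with a toy simulation. This is purely a conceptual sketch, not Apple's actual hardware behavior: memory is a dictionary, the "DMP" is a function that prefetches any value falling in a plausible pointer range, and the attacker infers one secret bit by probing whether a prefetch occurred.

```python
# Toy model of the DMP pointer-confusion leak (conceptual sketch only;
# the real DMP in Apple Silicon is far more complex).

PTR_RANGE = range(0x1000, 0x2000)   # addresses the toy DMP treats as valid pointers

def dmp_prefetch(memory, accessed_addrs, cache):
    """Scan freshly accessed values; prefetch any that look like pointers."""
    for addr in accessed_addrs:
        value = memory[addr]
        if value in PTR_RANGE:       # value "looks like" a pointer...
            cache.add(value)         # ...so the DMP speculatively dereferences it

def victim_compute(chosen_input, secret_bit):
    """Victim mixes a secret bit into a value the attacker partly controls.
    If secret_bit == 1, the combined value lands in pointer range."""
    return chosen_input + (secret_bit * 0x1000)

memory = {0x500: 0}                  # victim's working buffer at address 0x500
cache = set()

for secret_bit in (0, 1):
    cache.clear()
    memory[0x500] = victim_compute(0x0800, secret_bit)  # attacker-chosen input
    dmp_prefetch(memory, [0x500], cache)
    # Attacker probes the cache: a hit on the pointer range reveals the bit.
    leaked = 1 if any(a in PTR_RANGE for a in cache) else 0
    print(f"secret_bit={secret_bit} -> attacker infers {leaked}")
```

Repeating this kind of probe with different chosen inputs is what lets the real attack reconstruct an entire key bit by bit over time.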

The attack is not an instant crack of an encryption key. However, the attack can be carried out repeatedly, allowing the key to be revealed over time.

The GoFetch attack uses the same user privileges as many other third-party macOS apps, rather than requiring root access. This lowers the barrier to entry for actually running the attack, but it's not the whole story.

The GoFetch app running the attack must also execute on the same chip cluster as the cryptographic target app, with both using either the efficiency cores or the performance cores at the same time. The attack is cluster-dependent rather than core-dependent, meaning it will still work if the two apps run on different cores within the same cluster.

The researchers claim the attack works against classic encryption algorithms as well as newer quantum-hardened versions.

As to its effectiveness, the researchers' test app was able to extract a 2,048-bit RSA key in less than an hour, and a 2,048-bit Diffie-Hellman key in just over two hours. Ten hours of data gathering were needed to extract a Dilithium-2 key, excluding offline processing time.

Difficult to thwart

The main problem with the attack is that it cannot be patched in Apple Silicon itself, since the DMP is a central part of the design. Instead, it requires the developers of cryptographic software to add mitigations that work around the problem.

The problem is that any mitigation changes will increase the workload required to perform the operations, in turn impacting performance. However, these impacts should only affect applications that use encryption and employ the mitigations, rather than other general app types.

In the case of one mitigation, ciphertext blinding, the effectiveness varies between algorithms, and it can require as much as twice the usual resources.

Running the processes only on the efficiency cores is also a possibility, since those cores lack DMP functionality. Again, encryption performance will take a hit, since the work isn't running on the faster performance cores.

A third option applies only to M3 chips, where a special bit can be flipped to disable the DMP. The researchers admit they don't know what performance penalty that would incur.

Apple declined to comment on the report. The researchers say they responsibly disclosed the issue to Apple before the public release, informing the company on December 5, 2023.

Some of the researchers previously worked on another discovery from 2022, also concerning Apple Silicon's DMP usage. At the time, the so-called Augury flaw was deemed not "that bad," and "likely only a sandbox threat model."

History repeating

Chip vulnerabilities can be a big problem for device producers, especially if they have to make changes to operating systems and software in order to maintain security.

In 2018, the Meltdown and Spectre chip flaws were discovered, which affected all Mac and iOS devices, as well as nearly every x86 device produced since 1997.

Those security exploits relied on "speculative execution," in which a chip improves its speed by working on multiple instructions simultaneously, or even out of order. As the name suggests, the CPU speculatively continues executing down a path before a branch completes.

Both Meltdown and Spectre used the functionality to access "privileged memory," which could include kernel memory.

The discovery of the flaws led to a flood of other similar attacks, chiefly against Intel chips, including Foreshadow and Zombieload.

This is also not the first issue found with the design of Apple Silicon chips. In 2022, MIT researchers discovered an unfixable vulnerability dubbed "PACMAN," which capitalized on pointer authentication processes to create a side-channel attack.



19 Comments

twolf2919 149 comments · 2 Years

While definitely a vulnerability, I wish the author had highlighted how difficult it is to exploit.   Just telling me that the attack "must also be used on the same chip cluster as the cryptographic target app in order to function, and both must use the efficiency cores or the performance cores at the same time" doesn't really tell me that.  It sure sounds very esoteric - I'm not running a cluster on my MacBook (cluster of one?) nor do I really know what a 'cryptographic target app' means.

As most attacks, this requires you - at the very least - to download and run an app of unknown origin.  Don't do that.

VictorMortimer 239 comments · New User

twolf2919 said:
While definitely a vulnerability, I wish the author had highlighted how difficult it is to exploit.   Just telling me that the attack "must also be used on the same chip cluster as the cryptographic target app in order to function, and both must use the efficiency cores or the performance cores at the same time" doesn't really tell me that.  It sure sounds very esoteric - I'm not running a cluster on my MacBook (cluster of one?) nor do I really know what a 'cryptographic target app' means.

As most attacks, this requires you - at the very least - to download and run an app of unknown origin.  Don't do that.
So, stick to open source only?

Because the reality is that this can be hidden just about anywhere in a closed source app, and your precious app store can't protect you.


killroy 286 comments · 17 Years

Well, heck, if it ain't one thing, it's another.

maltz 507 comments · 13 Years

twolf2919 said:
As most attacks, this requires you - at the very least - to download and run an app of unknown origin.  Don't do that.

Well, it's not always that simple.  A couple of times a year there's a security issue that allows arbitrary code execution when processing an image or some other type of data - sometimes already in the wild.  If your un-patched phone visits a website with such malicious content, or sometimes even receive a text containing it, you've "downloaded an app of unknown origin" and run it without even knowing it.

Because the reality is that this can be hidden just about anywhere in a closed source app, and your precious app store can't protect you.

Sure it can - to a point.  The app store definitely scans app code for malicious activity such as this.  It's a cat-and-mouse game, though, as malware tries to obfuscate what it's doing.  So it's not perfect, but it's far from useless.

techlogik77 1 comment · New User

maltz said:
twolf2919 said:
As most attacks, this requires you - at the very least - to download and run an app of unknown origin.  Don't do that.

Well, it's not always that simple.  A couple of times a year there's a security issue that allows arbitrary code execution when processing an image or some other type of data - sometimes already in the wild.  If your un-patched phone visits a website with such malicious content, or sometimes even receive a text containing it, you've "downloaded an app of unknown origin" and run it without even knowing it.
Because the reality is that this can be hidden just about anywhere in a closed source app, and your precious app store can't protect you.
Sure it can - to a point.  The app store definitely scans app code for malicious activity such as this.  It's a cat-and-mouse game, though, as malware tries to obfuscate what it's doing.  So it's not perfect, but it's far from useless.

Was going to post something similar to the above. The hidden print flaw and 2 other zero days which a text message was sent, you never got it/received any notification of the text in imessage etc...and it exploited the 3 zero days in the background installing malicious software just waiting to be used later on. App store scanning doesn't mean much when it is typically zero day exploits that are a means to get your mac/iphone to perform some function to then exploit this vulnerability. But I guess the ultimate question here is...what exactly would be the case use of this flaw? Would like to see/hear some examples of how this could be used to perform some function/malicious thing. Like steal banking information/credentials or other sensitive things??

In the meantime, Devs will need to rebuild apps and push them out for M3 platform and disable that switch after some testing. Problem is...is the M3 now turned into an M1 with the performance hit or now an i9 Intel equivalent? They didn't do any testing with that. And all the other mitigation things for M1/M2 and my iPhone 12 Pro max doesn't sound like fun or good for performance.