
This might be how law enforcement agencies break into the iPhone

Credit: Blocks/Unsplash


A group of cryptography experts has proposed a theory about how law enforcement agencies can still break into iPhones despite continuous iOS patches and layers of safeguards: Apple's strongest encryption now protects less data than it used to.

Matthew Green, an associate professor at Johns Hopkins Information Security Institute, proposed the theory in a Twitter thread on Wednesday in response to news of the ACLU suing for information about iPhone unlocking methods. The theory is based on research from two of his students, Maximilian Zinkus and Tushar M. Jois.

Green contends that law enforcement agencies no longer need to break the strongest encryption on an iPhone because not all types of user data are protected by it.

The research was prompted by the fact that forensic companies reportedly no longer have the ability to break Apple's Secure Enclave Processor, which makes it very difficult to crack an iPhone's passcode. Given that law enforcement agencies continue to break into locked devices, Green and his students began researching how that could be possible.

They came up with a possible answer, which Green said would be fully detailed in a report after the holidays. Although it's conjecture, it could explain how government and police entities are still able to extract data from locked iPhones.

It boils down to the fact that an iPhone can be in one of two states: Before First Unlock (BFU) and After First Unlock (AFU). A freshly powered-on device starts in the BFU state; when the user enters their passcode for the first time, it moves into the AFU state. The iPhone uses the passcode to derive several sets of cryptographic keys, which stay in memory and are used to encrypt and decrypt files.

When a user locks their device again, it doesn't drop back into BFU but remains in the AFU state. Green notes that only one set of cryptographic keys gets purged from memory, and that set stays gone until the user unlocks the iPhone again.

The purged set of keys is the one used to decrypt a subset of an iPhone's files that fall under a specific protection class. The other key sets, which stay in memory, are used to decrypt all other files.
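The AFU behavior described above can be illustrated with a minimal Python sketch. This is a conceptual model only, not Apple's implementation: the two class names, the key strings, and the single-purge `lock()` step are simplifications of the mechanism the article describes.

```python
class DataProtectionModel:
    """Toy model of iOS file-key availability in the AFU state.

    Two illustrative protection classes: "complete" (strongest; its key is
    purged when the phone locks) and "until_first_unlock" (key stays in
    memory after the first passcode entry).
    """

    CLASSES = ("complete", "until_first_unlock")

    def __init__(self):
        self.keys = {}    # protection class -> in-memory key material
        self.files = {}   # filename -> protection class

    def first_unlock(self):
        # Passcode entry derives a key for every protection class.
        self.keys = {c: f"key-for-{c}" for c in self.CLASSES}

    def lock(self):
        # AFU lock: only the strongest class's key is evicted from memory.
        self.keys.pop("complete", None)

    def store(self, name, protection_class):
        self.files[name] = protection_class

    def can_decrypt(self, name):
        # A file is readable only while its class key is still in memory.
        return self.files[name] in self.keys


phone = DataProtectionModel()
phone.first_unlock()
phone.store("mail.db", "complete")
phone.store("photos.db", "until_first_unlock")
phone.lock()
phone.can_decrypt("photos.db")   # True: its key survived the lock
phone.can_decrypt("mail.db")     # False: its key was purged
```

The asymmetry in `lock()` is the crux of the theory: an attacker who bypasses the lock screen on an AFU device inherits every key that was never purged.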

From here, all a law enforcement entity needs to do is use known software exploits to bypass the iOS lock screen and decrypt most of the files. Using code that runs with normal privileges, they could access data just as a legitimate app would. As Green points out, the crucial question is which files are protected by the purged set of keys.

Based on Apple's documentation, it appears that the strongest protection class only applies to mail and app launch data.

Comparing that to the equivalent documentation from 2012, it seems the strongest encryption doesn't safeguard as many data types as it once did.

The data types that don't get the strong protection include Photos, Texts, Notes, and possibly certain types of location data. Those are all typically of particular interest to law enforcement agencies.

Third-party apps, however, can opt in to protect user data with the strongest protection class.
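Putting the article's claims together, the AFU exposure can be tabulated. The class assignments below reflect the article's reading of Apple's documentation (on iOS the strongest class corresponds to `NSFileProtectionComplete`, which third-party apps can opt in to); the mapping itself is illustrative, not an official API.

```python
# Keys for this class remain in memory while the phone is locked (AFU).
AFU_AVAILABLE_CLASS = "until_first_unlock"

# Data types -> protection class, per the article's reading of Apple's docs.
data_classes = {
    "Mail": "complete",                  # strongest class; key purged on lock
    "Photos": "until_first_unlock",
    "Messages": "until_first_unlock",
    "Notes": "until_first_unlock",
}

def exposed_in_afu(classes):
    """Return data types whose keys remain in memory after the phone locks."""
    return sorted(k for k, v in classes.items() if v == AFU_AVAILABLE_CLASS)

exposed_in_afu(data_classes)   # Mail is the only type absent from the result
```

Under this model, everything except mail stays reachable by an attacker with an AFU lock-screen bypass, which matches the data types the article says interest law enforcement most.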

As for why Apple appears to have weakened the protections, Green theorizes that the company traded maximum security for specific app and system features, such as location-based reminders. Similarly, some apps wouldn't be able to function properly if the strongest encryption class were used for most data.

Green notes that the situation is "similar" on Android. But, for Apple, the cryptography professor says that "phone encryption is basically a no-op against motivated attackers."

The findings, along with other details and possible solutions, are outlined in a research paper penned by Green, Zinkus, and Jois.



28 Comments

StrangeDays 8 Years · 12986 comments

Fascinating... His 25-tweet thread is really interesting. It sounds like it comes down to the OS using the weaker encryption option on most of a device's relevant content in order to allow the software to do things in the background while your phone is locked -- using the decryption key stored in memory, which attackers have access to:

Most apps like to do things in the background, while your phone is locked. They read from files and generally do boring software things. When you protect files using the strongest protection class and the phone locks, the app can’t do this stuff.

elijahg 18 Years · 2842 comments

I wonder if this is intentional so Apple can keep telling its users their data is encrypted, which it is, but then also able to turn a blind eye to the hacks the law enforcement uses to dump the phone's contents. That way they don't get forced to put in an explicit backdoor, because there is a workaround. Either that, or Apple has been secretly forced to allow access and these encryption workarounds give the illusion of privacy and non-compliance with law enforcement bigwigs and yet they actually are bending, with this being the best way they've got to keep the agreement secret.

Rayz2016 8 Years · 6957 comments

Fascinating... His 25-tweet thread is really interesting. It sounds like it comes down to the OS using the weaker encryption option on most of a device's relevant content in order to allow the software to do things in the background while your phone is locked -- using the decryption key stored in memory, which attackers have access to:

Most apps like to do things in the background, while your phone is locked. They read from files and generally do boring software things. When you protect files using the strongest protection class and the phone locks, the app can’t do this stuff.

Indeed. 


The location data makes sense, but the other stuff he mentioned doesn’t really need to be unencrypted while no one’s looking at it. I’m wondering if it’s a change that was made to conserve power. 

Oh wait. Here’s something that happens in the background: indexing and processing for machine learning.  I reckon a lot of that gets done while the phone is locked and probably can’t be done without decrypting the data. 

dewme 10 Years · 5775 comments

elijahg said:
I wonder if this is intentional so Apple can keep telling its users their data is encrypted, which it is, but then also able to turn a blind eye to the hacks the law enforcement uses to dump the phone's contents. That way they don't get forced to put in an explicit backdoor, because there is a workaround. Either that, or Apple has been secretly forced to allow access and these encryption workarounds give the illusion of privacy and non-compliance with law enforcement bigwigs and yet they actually are bending, with this being the best way they've got to keep the agreement secret.

I think you are actually pointing in the right direction. Apple isn't stupid, and to believe that they are somehow being repeatedly "duped" by US and Israeli security experts despite their proclamations of providing "total security and privacy" for their customers is a little bit more than a stretch or the ultimate "oops." There is probably a game of Chicken going on between Apple and government agencies like the NSA. Apple knows that it could lock down their stuff in ways that would make life miserable for the NSA. At the same time Apple also knows if they actually did this all pretense of civility and private industry operating independently and without the heavy hand of government slapping them down would vanish. No matter how you want to spin this, there is no way that Apple (or any other private company) would come out as the "winner" in this struggle. The winners and losers in such a conflict are predetermined, so we'll all get to witness these little theatrical performances for as long as it takes to avoid or at least delay the inevitable outcome. 

Gaby 6 Years · 194 comments

I’d be interested to know whether, when you manually lock the phone down with a long press on sleep/wake + volume (which locks out biometrics and requires passcode re-entry), it is considered BFU or AFU. Technically it is AFU, but from what I remember Apple execs discussing, that is supposed to lock down the phone. In which case it would still be feasible to lock people out without a power down. Hmmm....