Apple got more than it bargained for in its stand against government snooping. With the FBI keeping mum on methods used to extract data from an iPhone tied to last year's San Bernardino terror attack, Apple must patch a security hole it knows nothing about, a task one report suggests is made more difficult by a recent reorganization of its security team.
Citing current and former Apple employees, The New York Times reports the company's security operation has been in a state of transition since late last year. Of direct relevance to the Department of Justice case, Dallas DeAtley, one of a handful of managers with experience handling government requests for iPhone data, changed positions last year.
The report notes Apple previously staffed two security outfits: Core OS Security Engineering and a general product security team, the latter of which was divided into smaller groups responsible for encryption, anonymity and other privacy issues. In addition, the product security team included a reactive force that responded to threats discovered internally and by outside sources, while the so-called "RedTeam" worked proactively to ferret out potential device weaknesses.
According to former employees, the product security arm was divided sometime last year. The personal privacy team was assigned to a new manager, while other units, including the "RedTeam," moved under the Core OS Security Engineering umbrella and its former manager DeAtley.
How the transitional period affected Apple's ability to discover exploits, issue patches and maintain product security is unknown, though a high rate of turnover is to be expected in the tech industry. As noted by The Times, security engineers are hot commodities, meaning Apple management likely anticipated a certain rate of attrition.
Apple, like many tech firms, is always on the lookout for fresh blood. The company has in the past poached engineers from rival corporations and is no stranger to making key acquisitions in efforts to stay ahead of the curve. For example, Apple last November hired two firmware security experts who ran "deep system security" startup LegbaCore and helped develop a proof-of-concept Thunderbolt vulnerability dubbed Thunderstrike 2.
The government on Monday withdrew a California court order compelling Apple's assistance in unlocking an iPhone 5c used by San Bernardino terror suspect Syed Rizwan Farook. Apple resisted DOJ pressure, maintaining throughout that creating a software workaround put millions of iOS devices at risk of intrusion.
Federal prosecutors yesterday said an outside party approached the FBI with a viable data extraction method just days prior to a scheduled evidentiary hearing, rendering the case against Apple moot. An ABC News report on Tuesday cited one law enforcement source as saying the iPhone exploit came to light not despite the very public court case, but because of it.
It is unclear whether FBI officials will hand the working vulnerability over to Apple now that the target data has been successfully extracted from Farook's iPhone, but chances are slim. A workable exploit, especially one inaccessible to Apple, is an invaluable digital forensics tool that might find use in multiple pending cases around the country. Apple faces a similar request for access in New York, for example.
For security researchers, privacy advocates and Apple, however, the mere existence of a workaround to built-in iOS device protections is a security disaster waiting to happen.
Comments
Given how fast this came about, I'm going to say the security company is going to keep it close to the chest, because otherwise all the LEOs (law enforcement officers) around the world are going to want it.
After encryption, the best security guarantee is guaranteed self-destruction.
I believe the Feds - like the NSA/CIA have for years - used hardware intervention to crack the San Bernardino iPhone. It seems possible to embed commands in the CPU or graphics processor that detect any attempt to hardwire into the internals and generate a command that destroys the links that would enable further snooping.
They probably "patched" it already in the 5s and later phones, since fooling the retry count was the easiest way to get in.
That's the thing; if it was a 5s, Apple could not have done a thing except maybe decap the secure enclave and do some deep shit in there.
If they're willing to do that and spend half a million dollars to do it, then it is probably national security and, hey, let them do it.
In this case, with the 5c, the hack was easier and "only" cost $15K.
Any attempt to access the secure enclave should wipe the keys in there; the problem is that this kind of protection would normally make the chip much bigger. Could that fit in 5 mm? This thing also has to be robust enough not to be triggered by accident.
I'd read they cloned the memory and basically "built" a new phone.
Not sure how anybody can get around that loophole. Unless the data has markers that tie it to chip identifiers.
And THAT would be serious biz.
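To make the idea in the comments above concrete: the widely reported "NAND mirroring" approach amounts to imaging the phone's flash storage and restoring that image whenever the failed-passcode counter nears the wipe limit. The Python sketch below is a hypothetical model of that logic only; the class, counter and ten-try limit are illustrative assumptions, not Apple's actual implementation. It shows why a counter kept in copyable storage can be rolled back, whereas one held inside tamper-resistant hardware cannot simply be rewritten.

# Hypothetical sketch of a "NAND mirroring"-style attack on a passcode retry counter.
# All names and limits here are illustrative assumptions, not Apple's actual design.

import copy

MAX_TRIES = 10  # assumed wipe-after-ten-failures policy


class ClonableFlash:
    """Retry counter stored in ordinary flash that can be imaged and rewritten."""

    def __init__(self):
        self.failed_tries = 0

    def snapshot(self):
        # Attacker images the chip while the counter is still low.
        return copy.deepcopy(self)

    def restore(self, image):
        # Attacker writes the saved image back, rolling the counter back down.
        self.failed_tries = image.failed_tries


def brute_force(flash, secret, candidates):
    """Guess every candidate passcode, restoring the image before the wipe limit."""
    baseline = flash.snapshot()
    for guess in candidates:
        if flash.failed_tries >= MAX_TRIES - 1:
            flash.restore(baseline)  # never let the device reach the wipe threshold
        if guess == secret:
            return guess
        flash.failed_tries += 1
    return None


if __name__ == "__main__":
    storage = ClonableFlash()
    found = brute_force(storage, secret="7391",
                        candidates=(f"{n:04d}" for n in range(10000)))
    print("Recovered passcode:", found)  # prints 7391 without ever triggering a wipe

A counter held inside the Secure Enclave on the 5s and later cannot simply be imaged and written back like this, which is what the earlier comment about decapping the chip is getting at.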
This is GOOD for Apple. No backdoor, the FBI needs the physical iPhone to extract your data, and this "exploit" only encourages Apple to make iOS even more secure.