Affiliate Disclosure
If you buy through our links, we may get a commission. Read our ethics policy.

How Apple's walled garden iPhone security can help hackers evade scrutiny

Apple locks down its devices to prevent malware, but it's impossible to prevent everything

Apple's hardware and software choices have made its mobile ecosystem unusually secure, but the same systems and policies that keep most hackers out may be dramatically helping the few who can get past them.

Apple not only publicly champions privacy, it locks down its devices to protect security — even if some still say it's not enough. Now researchers are saying that this security actively benefits the small proportion of hackers who are able to defeat it.

"It's a double-edged sword," senior researcher Bill Marczak of cybersecurity firm Citizen Lab said to MIT's Technology Review. "You're going to keep out a lot of the riffraff by making it harder to break iPhones. But the 1% of top hackers are going to find a way in and, once they're inside, the impenetrable fortress of the iPhone protects them."

Marczak says that, for example, none of his team's tools could initially find any evidence of hacking on Al Jazeera journalist Tamer Almisshal's iPhone. Only by looking at the phone's internet traffic was the team able to identify connections to servers owned by Israeli hacking company NSO Group.
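Citizen Lab has not published the tooling it used, but the general technique — comparing hosts observed in a device's traffic capture against a list of known indicators of compromise (IOCs) — can be sketched in a few lines. The domain names below are purely illustrative, not real NSO Group infrastructure:

```python
# Minimal sketch of IOC matching: flag observed network connections whose
# hostnames appear on a known-bad list. Domains here are placeholders.
KNOWN_BAD_DOMAINS = {"example-exploit-server.com", "cdn.example-spyware.net"}

def flag_suspicious(observed_hosts):
    """Return the sorted subset of observed hostnames that match the IOC list.

    `observed_hosts` is an iterable of hostnames seen in the device's
    traffic, e.g. extracted from DNS logs or a packet capture."""
    return sorted({host.lower() for host in observed_hosts} & KNOWN_BAD_DOMAINS)
```

In practice, real investigations match on far more than exact domain names (IP ranges, TLS certificate fingerprints, URL patterns), but the principle is the same: the evidence comes from what the phone talks to, not from what's on the phone.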

What's more, Apple's ongoing efforts to improve security derailed an investigation Marczak was conducting in 2020. Unspecified updates to iOS reportedly disabled a jailbreaking tool that Citizen Lab was using, preventing the tool from examining a particular update folder, which Marczak says is where hackers were hiding their code.

"We just kind of threw our hands up," he said. "We can't get anything from this — there's just no way."

Searching for evidence of malware

According to Technology Review, another security firm approaches the problem by looking for indirect clues. Trail of Bits security engineer Ryan Stortz uses an Apple-approved app called iVerify, which looks for anomalies like unexplained file changes.

He likens the approach to a tripwire: you can't observe the malware directly, but you can detect it in action. "As we lock these things down," he said, "you reduce the damage of malware and spying."

Security researcher Patrick Wardle has previously described the increased security Apple has brought to the Mac with Apple Silicon, and he told Technology Review that this adds to the overall protection of Apple devices.

"[Apple's] iOS is incredibly secure," he said. "Apple saw the benefits and has been moving them over to the Mac for a long time, and the M1 chip is a huge step in that direction."

"[However, security] tools are completely blind, and adversaries know this," he continued.

Aaron Cockerill of mobile security firm Lookout thinks that, even so, Apple will continue to lock down its devices, and that other manufacturers will follow.

"Android is increasingly locked down," he said. "We expect both Macs and ultimately Windows will increasingly look like the opaque iPhone model."

"We endorse that from a security perspective," he continued, "but it comes with challenges of opacity."

"I personally believe the world is marching toward this," said Stortz. "We are going to a place where only outliers will have computers — people who need them, like developers."

"The general population will have mobile devices which are already in the walled-garden paradigm," he continued. "That will expand. You'll be an outlier if you're not in the walled garden."

An Apple spokesperson told Technology Review that the company believes it is pursuing the correct balance between usability and security.

Separately, Apple recently responded to Citizen Lab's reports of vulnerabilities in iMessage by overhauling how the app and service handle incoming data. Starting in iOS 14, a new sandbox called BlastDoor parses all untrusted data in isolation, analyzing it for issues before it can affect users.
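BlastDoor's internals aren't public, but the core idea is well understood: parse untrusted input inside a separate, expendable process, so that a malformed or malicious payload can at worst crash or hang the sandboxed worker rather than the host app. A minimal sketch of that pattern, using JSON parsing as a stand-in for iMessage's far more complex payload formats:

```python
# Sketch of sandboxed parsing: hand untrusted input to a child process.
# If the worker crashes, hangs, or rejects the input, the host survives.
import json
import subprocess
import sys

def parse_untrusted(raw: str, timeout: float = 5.0):
    """Parse `raw` as JSON in a child Python process; return (status, payload)."""
    worker_src = (
        "import json, sys\n"
        "try:\n"
        "    print(json.dumps(json.loads(sys.stdin.read())))\n"
        "except Exception:\n"
        "    sys.exit(1)\n"
    )
    try:
        proc = subprocess.run(
            [sys.executable, "-c", worker_src],
            input=raw, capture_output=True, text=True, timeout=timeout,
        )
    except subprocess.TimeoutExpired:
        return ("error", "timeout")   # hung parser: treat the input as hostile
    if proc.returncode != 0:
        return ("error", "rejected")  # worker died or refused the input
    return ("ok", json.loads(proc.stdout))
```

The real BlastDoor goes much further (it is written in a memory-safe language and runs under tight sandbox profiles), but the architectural move is the same: the blast radius of bad input is confined to a disposable worker.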



12 Comments

lkrupp 19 Years · 10521 comments

I read every word of this article trying to understand what it says. The bottom line seems to be that security researchers are now saying that Apple’s drive for security and privacy is actually helping the bad guys because they, the supposed good guys, can’t get into iOS to see what's going on. 

"[Apple's] iOS is incredibly secure," he said. "Apple saw the benefits and has been moving them over to the Mac for a long time, and the M1 chip is a huge step in that direction.”

Are the good guys asking for a backdoor like the government so they can do their poking around?

MplsP 8 Years · 4047 comments

lkrupp said:
I read every word of this article trying to understand what it says. The bottom line seems to be that security researchers are now saying that Apple’s drive for security and privacy is actually helping the bad guys because they, the supposed good guys, can’t get into iOS to see what's going on. 

"[Apple's] iOS is incredibly secure," he said. "Apple saw the benefits and has been moving them over to the Mac for a long time, and the M1 chip is a huge step in that direction.”

Are the good guys asking for a backdoor like the government so they can do their poking around?

I think it can be summed up by the line "it's a double edged sword" 

Ultimately, the good guys were using a sort of back door to look for signs of infiltration but that backdoor got locked so now no one can use it. Good if it keeps a bad guy out, but bad if the bad guy finds another way in and the good guys can't see it any more.

It's just another example of the cat and mouse game of security.

crowley 15 Years · 10431 comments

lkrupp said:
I read every word of this article trying to understand what it says. The bottom line seems to be that security researchers are now saying that Apple’s drive for security and privacy is actually helping the bad guys because they, the supposed good guys, can’t get into iOS to see what's going on. 

"[Apple's] iOS is incredibly secure," he said. "Apple saw the benefits and has been moving them over to the Mac for a long time, and the M1 chip is a huge step in that direction.”

Are the good guys asking for a backdoor like the government so they can do their poking around?

They're not asking for anything, it's just an observation.  Hiding the keys and the locks makes it harder for people to break in, but anyone who works out where the hiding place is will be sitting pretty, and since everything is hidden it's very hard to even see them entering or leaving.

larryjw 9 Years · 1036 comments

This suggests an interesting side effect of security: the more secure the system, the less able we are to detect a security breach. 

Now, that is obvious from a sociological standpoint -- who monitors the monitors.

But, even from a technological perspective? Are we into the area of non-computable functions? Gödel's incompleteness theorem?

dewme 10 Years · 5775 comments

Lots of insightful comments above, especially asking the question "who monitors the monitors." This is actually addressing some of the challenges within the domain of information security (InfoSec), which is still under the cybersecurity umbrella but with greater focus on the integrity of what goes on inside the walled garden.

In my own experience, matters of InfoSec become a very large concern when you have a distributed development organization. To be clear, it's not because anyone on the inside is any more or less trustworthy in one location than another, but because the potential for damage in the event of a breach can vary significantly between locations based on external factors, such as some governments insisting on easier access to what's inside the walled garden, or cultural tolerance for bypassing security and privacy measures.

One would hope that all of the gates into the walled garden are secured with equally strong locks, but the larger and more expansive your development footprint, the more difficult it is to ensure that this is always the case.