Inside iOS 10: Apple doubles down on security with cutting-edge differential privacy

After a bruising battle with the Federal Bureau of Investigation and a contentious debate over encryption in the wake of the San Bernardino terrorist shooting, Apple has doubled down on privacy protection, building cutting-edge techniques into iOS 10 that allow advanced new features while safeguarding user data.

"All of this great work in iOS 10 would be meaningless to us if it came at the expense of your privacy. And so in every feature that we do we carefully consider how to protect your privacy," Craig Federighi, Apple's senior vice president of Software Engineering, explained at this year's Worldwide Developers Conference.

In pursuit of advanced protection, Apple has invested in "differential privacy," a means of maximizing the accuracy of queries from statistical databases while at the same time minimizing the chances of identifying specific individuals. Differential privacy uses a variety of techniques to do this, like hashing, subsampling and noise injection.
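To make the idea concrete, here is a minimal sketch of randomized response, a classic noise-injection mechanism used in local differential privacy. It illustrates the general technique only; it is not Apple's actual implementation, and the probabilities and the yes/no question are assumptions.

```swift
import Foundation

// Randomized response: with probability p the device reports the truth,
// otherwise it reports a fair coin flip. Any single report is deniable.
func randomizedResponse(truth: Bool, truthProbability p: Double) -> Bool {
    if Double.random(in: 0..<1) < p {
        return truth              // honest answer
    }
    return Bool.random()          // injected noise
}

// Debiasing on the server: observed = p * trueRate + (1 - p) * 0.5,
// so trueRate can be estimated from the observed "yes" fraction.
func estimateTrueRate(observed: Double, truthProbability p: Double) -> Double {
    (observed - (1 - p) * 0.5) / p
}

// Simulate 100,000 devices where 30% truly used some feature.
let p = 0.75
let n = 100_000
var yesCount = 0
for _ in 0..<n {
    let truth = Double.random(in: 0..<1) < 0.3
    if randomizedResponse(truth: truth, truthProbability: p) { yesCount += 1 }
}
let observed = Double(yesCount) / Double(n)
print("observed:", observed, "estimated:", estimateTrueRate(observed: observed, truthProbability: p))
```

Because any individual answer may be a coin flip, no single report reveals the truth about its sender, yet the aggregate estimate converges on the true rate as reports accumulate. That is the trade-off between query accuracy and individual privacy described above.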

Apple's work in differential privacy has earned the praise of Aaron Roth, a world-class privacy researcher at the University of Pennsylvania, who called the technology in iOS 10 "groundbreaking" and said that scaling it up and incorporating it broadly into the technology "is visionary and positions Apple as the clear privacy leader among technology companies today."

The technique matters because Apple, like many companies, analyzes device usage to spot mass trends that inform software improvements, such as discovering new words to include in QuickType.
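As a hypothetical illustration of how hashing and noise could combine to surface trending words, the sketch below has each device report the hash bucket of one typed word, sometimes substituting a random bucket. The hash function, bucket count and noise rate are all illustrative assumptions, not details of Apple's QuickType pipeline.

```swift
import Foundation

let bucketCount = 1024

// Stable non-cryptographic hash (djb2) used purely for illustration.
func bucket(for word: String) -> Int {
    var h: UInt64 = 5381
    for byte in word.utf8 { h = h &* 33 &+ UInt64(byte) }
    return Int(h % UInt64(bucketCount))
}

// Each device reports the bucket of one typed word, but with
// probability `noiseRate` it reports a uniformly random bucket instead.
func noisyReport(word: String, noiseRate: Double) -> Int {
    if Double.random(in: 0..<1) < noiseRate {
        return Int.random(in: 0..<bucketCount)
    }
    return bucket(for: word)
}

// Server side: aggregate reports and look for buckets whose counts
// rise well above the uniform noise floor.
var counts = [Int](repeating: 0, count: bucketCount)
let words = ["on", "fleek", "on", "fleek", "hello", "fleek"]   // toy stream
for w in words { counts[noisyReport(word: w, noiseRate: 0.25)] += 1 }
if let hot = counts.indices.max(by: { counts[$0] < counts[$1] }) {
    print("hottest bucket:", hot, "count:", counts[hot])
}
```

Hashing is one-way, so the server sees only bucket counts; heavy buckets flag candidate trends that must still be matched against candidate word lists, while the random reports give every device deniability.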

In addition, Apple continues to provide end-to-end encryption by default in apps like FaceTime, Messages and HomeKit to protect communications. And when user data is analyzed with advanced deep learning and artificial intelligence, that analysis happens on the device itself, keeping personal data under user control.

Federighi also pledged that Apple builds no user profiles based on internet searches.

"We believe you should have great features and great privacy," he said. "You demand it and we're dedicated to providing it."

Editor's note: This article was originally published in June following Apple's announcement of iOS 10 at WWDC 2016. It has been updated and republished to coincide with the mobile operating system's public release. For more on iOS 10, see AppleInsider's ongoing Inside iOS 10 series.



14 Comments

wdowell 15 Years · 235 comments

I've heard enough concerns about whether this Differential Privacy theory is all it's cracked up to be that I want to hear what the security community says about it. Right now I'd love to know: given that it's possible for Apple to 'de-noise' the mass data, could the NSA and others do the same? I haven't a clue, but there seems to be no answer as yet.

MacPro 18 Years · 19845 comments

wdowell said:
I've heard enough concerns about whether this Differential Privacy theory is all it's cracked up to be that I want to hear what the security community says about it. Right now I'd love to know: given that it's possible for Apple to 'de-noise' the mass data, could the NSA and others do the same? I haven't a clue, but there seems to be no answer as yet.

Good take on this here: http://www.theverge.com/2016/6/15/11940010/walt-mossberg-apple-wwdc-2016-recap-themes

bobolicious 10 Years · 1177 comments

"Apple builds no user profiles based on Internet searches." ...what criteria might be used...? And presuming that is indeed true, others may. And so why does Safari so frequently seem to exit of 'Private Browsing' mode when launching a 'new window' off a link...? Can this be an oversight, if long trails are often in browsing history without obvious warning... Does every 'upgrade' of these OS & apps introduce privacy creep 'features' that could potentially harvest & collate data if hacked or sociopolitical conditions change? I think of many things beyond a users control such as others use of contact photos & faces (id), iCloud (so ON by default), maps, and closer to home finger print access, location manager, etc. Why one must choose between location tracking or losing IMEI location tracking to wipe a users' data if a phone is lost or stolen ? Has this all happened so incrementally it is easy to view as helpful, but is the potential downside devastating? For consideration: http://www.dezeen.com/2015/05/27/rem-koolhaas-interview-technology-smart-systems-peoples-eagerness-sacrifice-privacy-totally-astonishing/ To quote a CNBC commentator ‘it’s a problem with our society’... http://video.cnbc.com/gallery/?video=3000524738 https://www.eff.org/wp/dangerous-terms-users-guide-eulas http://jolt.richmond.edu/v1i1/liberman.html

cpsro 14 Years · 3239 comments

Perhaps Congress would get off its collective @ss and mandate privacy protections like this.

Haha, just kidding.

revenant 15 Years · 610 comments

I find it extremely interesting and worrisome that governments like the US have pushed hard to get into private mobile phones, yet have themselves been hacked and lost data, the US DNC and US/Korean joint forces being the most recent successful targets.
I notice governments want citizens to have weaker security and privacy, while citizens are now fed up with governments losing data by not being secure enough.