Apple's 'differential privacy' still collects too much specific data, study says
Apple's use of "differential privacy" — a method that inserts random noise into data as it's collected en masse — doesn't go far enough to protect personal information, a study suggested this week.
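To make the idea concrete: one of the simplest differentially private mechanisms is "randomized response," where each user answers truthfully only some of the time, so no individual report can be trusted, yet aggregate statistics can still be recovered. This is a minimal sketch of the general technique the article describes, not Apple's actual implementation; the function names are illustrative.

```python
import random

def randomized_response(true_answer: bool) -> bool:
    """Report the true answer with probability 1/2; otherwise
    report a uniformly random answer. Any single report is
    deniable: P(report yes | yes) = 0.75, P(report yes | no) = 0.25,
    a ratio of 3, i.e. epsilon = ln(3) ~= 1.1."""
    if random.random() < 0.5:
        return true_answer
    return random.random() < 0.5

def estimate_true_rate(responses, n: int) -> float:
    """Recover the population rate from noisy reports.
    Since P(report yes) = 0.5 * p + 0.25, the unbiased
    estimator is p_hat = 2 * observed_rate - 0.5."""
    observed = sum(responses) / n
    return 2 * observed - 0.5
```

Run over a large population, the estimator converges on the true rate even though every individual answer is noisy; that trade-off between deniability and aggregate accuracy is exactly what the epsilon parameter tunes.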
Apple's "privacy loss parameters" still allow too much specific data to slip through, according to the study, which was highlighted by Wired and published by five researchers from the University of Southern California, Indiana University, and Tsinghua University. While both macOS and iOS 10 are said to have issues, the latter platform is believed to be the more problematic one.
Another concern is that Apple keeps its loss parameter — also known as its epsilon — secret, which means that the company could be changing it on the fly without any outside scrutiny.
"Apple's privacy loss parameters exceed the levels typically considered acceptable by the differential privacy research community," said USC professor Aleksandra Korolova.
macOS is said to have an epsilon of 6, while iOS 10 sits at 14. By comparison, Google claims the differential privacy system in Chrome has an epsilon of 2 in most cases, and a lifetime ceiling of 8 to 9. Google also open-sources related code, making it possible to double-check its claims.
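Why those numbers matter: in a common construction, the Laplace mechanism, the noise added to a statistic scales inversely with epsilon, so a larger epsilon means less noise and weaker privacy. The sketch below is a generic illustration of that relationship, not a reconstruction of Apple's or Google's systems; the epsilon values are the ones reported above.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    """Draw one sample from a Laplace(0, scale) distribution
    via inverse-CDF sampling. Expected absolute noise equals
    `scale`."""
    u = random.random() - 0.5  # uniform on (-0.5, 0.5)
    return -scale * math.copysign(math.log(1 - 2 * abs(u)), u)

def private_count(true_count: float, epsilon: float,
                  sensitivity: float = 1.0) -> float:
    """Release a count with Laplace noise of scale
    sensitivity / epsilon, the standard calibration for
    epsilon-differential privacy."""
    return true_count + laplace_noise(sensitivity / epsilon)

# A count protected at Chrome's claimed epsilon of 2 gets noise
# of expected magnitude 1/2 = 0.5; at iOS 10's reported epsilon
# of 14 the expected magnitude is only 1/14 ~= 0.07.
```

In other words, the reported jump from an epsilon of 2 to 14 shrinks the typical noise by a factor of seven, which is why the researchers consider the higher values a meaningful loss of protection.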
In response to the study, Apple said it disagrees with many of its points, including the degree to which collected data can be correlated with a particular person. The company insisted that it varies the noise based on the type of data, and that the researchers simply combined the epsilons for all data types on the assumption that the data could be pieced together.
It also pointed to policies like time limits on data storage, the exclusion of IP addresses, and the decision to make collection opt-in — referring to installation and setup screens where people can choose whether or not to share usage and diagnostics information.
The study found that the iOS 11 beta had an epsilon of 43, but that's likely because of normal testing designed to weed out bugs before the software's Sept. 19 launch.