Apple's privacy chief Erik Neuenschwander has detailed some of the protections built into the company's CSAM scanning system that prevent it from being used for other purposes, including clarifying that the system performs no hashing if iCloud Photos is off.
The company's CSAM detection system, which was announced with other new child safety tools, has caused controversy. In response, Apple has offered numerous details about how it can scan for CSAM without endangering user privacy.
In an interview with TechCrunch, Apple privacy head Erik Neuenschwander said the system was designed from the start to prevent government overreach and abuse.
For one, the system only applies in the U.S., where Fourth Amendment protections already guard against illegal search and seizure.
"Well first, that is launching only for US, iCloud accounts, and so the hypotheticals seem to bring up generic countries or other countries that aren't the US when they speak in that way," Neuenschwander said "And therefore it seems to be the case that people agree US law doesn't offer these kinds of capabilities to our government."
But even beyond that, the system has baked-in guardrails. For example, the hash list that the system uses to tag CSAM is built into the operating system. It can't be updated from Apple's side without an iOS update. Apple also must release any updates to the database on a global scale — it can't target individual users with specific updates.
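As a rough illustration of that guardrail, the sketch below shows the kind of check a client could perform when the hash list ships inside the OS image: load the bundled database and refuse it unless it matches a digest fixed at build time, so a different list cannot be pushed to one user or one region without a full OS update. This is not Apple's implementation; the type, file format, and `expectedDigest` value are all assumptions for illustration.

```swift
import Foundation
import CryptoKit

// Illustrative sketch only -- not Apple's code. The point: the hash database is
// part of the OS build, and its digest is compiled into the same build, so the
// list can't be swapped per user or per region without shipping an OS update.
struct BundledHashDatabase {
    /// Placeholder digest for the single, global database, fixed at build time.
    static let expectedDigest = Data(repeating: 0xAB, count: 32)

    /// Loads the bundled database and rejects anything that doesn't match the pinned digest.
    static func load(from url: URL) throws -> Set<Data> {
        let raw = try Data(contentsOf: url)
        guard Data(SHA256.hash(data: raw)) == expectedDigest else {
            throw CocoaError(.fileReadCorruptFile)
        }
        // Treat the file as a flat list of 32-byte hash entries (an assumed format).
        let entrySize = 32
        var entries = Set<Data>()
        var offset = 0
        while offset + entrySize <= raw.count {
            entries.insert(raw.subdata(in: offset ..< offset + entrySize))
            offset += entrySize
        }
        return entries
    }
}
```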
The system also only flags collections of known CSAM. A single image isn't going to trigger anything. More than that, images that aren't in the database provided by the National Center for Missing and Exploited Children won't get flagged either.
Apple also has a manual review process. If an iCloud account gets flagged for a collection of illegal CSAM material, an Apple team will review the flag to ensure that it's actually a correct match before any external entity is alerted.
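A minimal sketch of that threshold-plus-review logic appears below, under the simplifying assumption of a plain match count; Apple's actual design uses private set intersection and threshold secret sharing so nothing is learnable below the threshold. The type names and the threshold value here are hypothetical.

```swift
import Foundation

// Hypothetical sketch of the threshold idea: individual matches do nothing;
// only a collection of matches at or above the threshold queues the account
// for human review, and only a reviewer can confirm a report.
struct MatchThresholdPolicy {
    let threshold: Int  // assumed value for illustration

    enum Outcome {
        case noAction
        case pendingHumanReview(matchCount: Int)
    }

    func evaluate(imageHashes: [Data], knownHashes: Set<Data>) -> Outcome {
        // Only hashes present in the provided database count as matches;
        // images outside that list are ignored entirely.
        let matchCount = imageHashes.filter { knownHashes.contains($0) }.count
        guard matchCount >= threshold else { return .noAction }
        // Even above the threshold, nothing is reported until a human
        // reviewer confirms the matches are correct.
        return .pendingHumanReview(matchCount: matchCount)
    }
}
```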
"And so the hypothetical requires jumping over a lot of hoops, including having Apple change its internal process to refer material that is not illegal, like known CSAM and that we don't believe that there's a basis on which people will be able to make that request in the US," Neuenschwander said.
Additionally, Neuenschwander added, there is still some user choice here. The system only works if a user has iCloud Photos enabled. The Apple privacy chief said that, if a user doesn't like the system, "they can choose not to use iCloud Photos." If iCloud Photos is not enabled, "no part of the system is functional."
"If users are not using iCloud Photos, NeuralHash will not run and will not generate any vouchers. CSAM detection is a neural hash being compared against a database of the known CSAM hashes that are part of the operating system image," the Apple executive said. "None of that piece, nor any of the additional parts including the creation of the safety vouchers or the uploading of vouchers to iCloud Photos is functioning if you're not using iCloud Photos."
Although Apple's CSAM feature has caused a stir online, the company rejects claims that the system can be used for any purpose other than detecting CSAM. Apple clearly states that it will refuse any government attempt to modify or use the system for something other than CSAM.
62 Comments
Neuenschwander said "... people agree US law doesn't offer these kinds of capabilities to our government."
That is not completely true; people have been discriminated against due to guilt by association. Imagine if a fan took a photo with a whistleblower like Snowden. So yeah, it can be abused by any country.
Somehow I think this is a case of Apple being pressured by govt to do this.
It honestly ruins their supposed privacy protection standards. No one wants AI to go through their stuff. Usually a probable cause standard would apply; now it's entirely new case law: use the cloud and you assume the risk of your things being looked through? Today it's about this issue, but what issue in the future will be the excuse to go through someone's files? I suppose it's the peril of using cloud storage, which isn't private to begin with.
Seems like this will cause many to wake up to the fact that cloud storage is not remotely private or secure.
“Okay, guys, listen up. Here’s the brief. We’re going to bake a database of illegal porn image hashes into the operating system.”
Well, gotta give them credit for thinking different. No one else would’ve thought to do this … not on a full stomach anyway.
It would be helpful (rather than just trashing the fix Apple has offered) to offer an alternative solution - unless we wish to state that there is no problem to solve.
Criticism is easy but solutions are difficult - but let's try.