
New FAQ says Apple will refuse pressure to expand child safety tools beyond CSAM

Apple's new child protection feature


Apple has published a response to privacy criticisms of its new iCloud Photos feature that scans for child abuse images, saying it "will refuse" any government demands to infringe privacy.

Apple's suite of tools meant to protect children has drawn mixed reactions from security and privacy experts, with some erroneously claiming that Apple is abandoning its privacy stance. Now Apple has published a rebuttal in the form of a Frequently Asked Questions document.

"At Apple, our goal is to create technology that empowers people and enriches their lives — while helping them stay safe," says the full document. "We want to protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material (CSAM)."

"Since we announced these features, many stakeholders including privacy organizations and child safety organizations have expressed their support of this new solution," it continues, "and some have reached out with questions."

"What are the differences between communication safety in Messages and CSAM detection in iCloud Photos?" it asks. "These two features are not the same and do not use the same technology."

Apple emphasizes that the new features in Messages are "designed to give parents... additional tools to help protect their children." Images sent or received via Messages are analyzed on-device "and so [the feature] does not change the privacy assurances of Messages."

CSAM detection in iCloud Photos does not send information to Apple about "any photos other than those that match known CSAM images."
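To make that distinction concrete, here is a minimal sketch in Swift of the "match only against known hashes" idea the FAQ describes. It is an illustration under stated assumptions, not Apple's implementation: the real system derives a NeuralHash from each photo and uses private set intersection with a match threshold, so the device never even learns which hashes matched. Every name and hash value below is hypothetical.

```swift
import Foundation

// Toy stand-in for a NeuralHash-style perceptual hash.
typealias PerceptualHash = String

// Hypothetical database of known-CSAM hashes (placeholder values only).
let knownHashes: Set<PerceptualHash> = ["a1b2c3", "d4e5f6"]

struct Photo {
    let filename: String
    let hash: PerceptualHash  // assumed to be computed on-device
}

// Only photos whose hash matches the known list produce a "safety
// voucher" alongside the upload; non-matching photos contribute nothing.
func safetyVouchers(for photos: [Photo]) -> [String] {
    photos.compactMap { photo in
        knownHashes.contains(photo.hash) ? "voucher:\(photo.filename)" : nil
    }
}

let library = [
    Photo(filename: "vacation.jpg", hash: "0f0f0f"),
    Photo(filename: "flagged.jpg", hash: "a1b2c3"),
]
print(safetyVouchers(for: library))  // ["voucher:flagged.jpg"]
```

The claim in the FAQ maps onto the `compactMap` above: a photo that matches nothing in the known-hash list produces no voucher, and therefore no information about it reaches Apple.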

Much of the document covers what AppleInsider detailed on Friday. However, a few points are now spelled out explicitly that weren't before.

First, a concern from privacy and security experts has been that this on-device image scanning could easily be extended to benefit authoritarian governments that demand Apple expand what it searches for.

"Apple will refuse any such demands," says the FAQ document. "We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future."

"Let us be clear," it continues, "this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government's request to expand it."

Apple's new publication on the topic comes after an open letter asking the company to reconsider its new features.

Second, while AppleInsider reported this earlier based on commentary from Apple, the company has now clarified in no uncertain terms that the feature does not work when iCloud Photos is turned off.



98 Comments

entropys 4316 comments · 13 Years

A concern from privacy and security experts has been that this on-device image scanning could easily be extended to benefit authoritarian governments that demand Apple expand what it searches for.

"Apple will refuse any such demands," says the FAQ document. "We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future."


Riiiight.

Translation: Here at Apple, we might have created a back door, but we promise to only ever use it for good. Pinky swear!

JaiOh81 61 comments · 8 Years

entropys said:
A concern from privacy and security experts has been that this on-device image scanning could easily be extended to benefit authoritarian governments that demand Apple expand what it searches for.

"Apple will refuse any such demands," says the FAQ document. "We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future."


Riiiight.

Translation: Here at Apple, we might have created a back door, but we promise to only ever use it for good. Pinky swear!

Came to say the same thing. I'm really disappointed in Apple. It's unfortunate that our privacy is being eroded even more. This is a really bad look for Apple, but they know their customers have little choice because the other guys are worse.

foregoneconclusion 2857 comments · 12 Years

entropys said: Riiiight.

Translation: Here at Apple, we might have created a back door, but we promise to only ever use it for good. Pinky swear!

Read the user agreement terms for iCloud and other cloud services: they have always had parameters for what is acceptable use of the service and have always reserved the right to screen files as a result. There has never been any "you can do whatever you want in the cloud and we'll never look at any files in the cloud" promise from any of these companies. 

The only people that think this is something new are people that never read the user agreements for cloud services. 

AlwaysWinter 6 comments · 3 Years

Their defense is that they would refuse authoritarian attempts, but we've seen two instances so far where Apple couldn't refuse: with iCloud in China and with FaceTime in Saudi Arabia. Setting the past aside, what's to say that the political or financial climate for Apple won't change and make it harder to say no than it is now? There may come a time when they want to say no but can't.

foregoneconclusion 2857 comments · 12 Years

AlwaysWinter said:
Their defense is that they would refuse authoritarian attempts, but we've seen two instances so far where Apple couldn't refuse: with iCloud in China and with FaceTime in Saudi Arabia. Setting the past aside, what's to say that the political or financial climate for Apple won't change and make it harder to say no than it is now? There may come a time when they want to say no but can't.

The problem with hypotheticals is that they can be applied to anything and anyone. That's why lawsuits based on hypotheticals get thrown out of court, like Yahoo! suing the government over subpoenas for data. Yahoo! imagined all kinds of scenarios where handing over the data could be abused, but the court said "where's the proof the government is actually doing any of that with the data" and the case was dismissed.