Open letter asks Apple not to implement Child Safety measures

An open letter making the rounds online asks Apple to halt plans to roll out new Child Safety tools designed to combat child sexual abuse material, with signatories including industry experts and high-profile names like Edward Snowden.

The document, which reads more like an indictment than an open letter, offers a rundown of Apple's Thursday announcement that details upcoming features designed to detect CSAM.

A multi-pronged effort, Apple's system uses on-device processing to detect and report CSAM images uploaded to iCloud Photos, as well as to protect children from sensitive images sent through Messages.

"While child exploitation is a serious problem, and while efforts to combat it are almost unquestionably well-intentioned, Apple's proposal introduces a backdoor that threatens to undermine fundamental privacy protections for all users of Apple products," the letter reads.

When it is implemented, Apple's system will hash user photos and match them against a hashed database of known CSAM. The process is carried out on-device before upload and applies only to images destined for iCloud Photos. A second tool uses on-device machine learning to protect children under 17 years old from viewing sexually explicit images in Messages. Parents can choose to be notified when children under 13 years old send or receive such content.
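As a rough illustration of the matching step described above, the sketch below checks image hashes against a set of known hashes and flags nothing until a match threshold is reached. This is a hypothetical, simplified model only: Apple's actual pipeline relies on its proprietary NeuralHash perceptual hash and cryptographic protections rather than the SHA-256 stand-in used here, and the SafetyVoucherScanner, knownHashes, and reportingThreshold names are invented for this example.

```swift
import Foundation
import CryptoKit

// Hypothetical illustration only: Apple's real system uses a proprietary
// perceptual hash (NeuralHash) plus cryptographic blinding, not SHA-256,
// and none of these type or property names come from Apple.
struct SafetyVoucherScanner {
    /// Hashes of known CSAM supplied by child-safety organizations (placeholder data).
    let knownHashes: Set<String>
    /// Number of matches required before anything is surfaced for review.
    let reportingThreshold: Int

    /// Stand-in for the on-device image hashing step.
    func hash(_ imageData: Data) -> String {
        SHA256.hash(data: imageData)
            .map { String(format: "%02x", $0) }
            .joined()
    }

    /// Only images queued for iCloud Photos upload are checked, and a report
    /// is considered only once the match count reaches the threshold.
    func shouldFlag(imagesPendingUpload: [Data]) -> Bool {
        let matches = imagesPendingUpload
            .filter { knownHashes.contains(hash($0)) }
            .count
        return matches >= reportingThreshold
    }
}
```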

According to the letter, Apple's techniques pose an issue because they bypass end-to-end encryption.

"Because both checks are performed on the user's device, they have the potential to bypass any end-to-end encryption that would otherwise safeguard the user's privacy," the letter argues.

For its part, Apple has gone on record as saying the new safety protocols do not create a backdoor to its hardware and software privacy features.

The letter goes on to include commentary and criticism from a range of experts including Matthew Green, a cryptography professor at Johns Hopkins University who was among the first to voice concern over the implications of Apple's measures. Green and Snowden are counted among the signatories, which currently lists 19 organizations and 640 individuals who added their mark via GitHub.

Along with a halt to implementation, the letter requests that Apple issue a statement "reaffirming their commitment to end-to-end encryption and to user privacy."



91 Comments

DAalseth 7 Years · 3077 comments

No matter how well-intentioned, this effort will be used to damage Apple's reputation, severely. It should be abandoned immediately.
Remember how Apple was excoriated by some last year for having a "monopoly" on COVID reporting apps, and that was a free thing they did with Google and kept no data. Apple just stuck a big red Kick Me sign on their back.

ArchStanton 4 Years · 200 comments

Not a big fan of some of the wording in the letter, but I understand the meaning behind the larger message. IMHO Apple made a mistake, at minimum, in how they rolled this out. DAalseth's comment was correct: this will be used to bludgeon Apple. How well it succeeds is doubtful, IMHO. But the surreal irony is that some of those doing the bludgeoning will be among the worst purveyors of data collection and exploitation.
No problem with the signers of the letter expressing this very important point. But here's my problem with the signers of this letter: where the hell have they been while the vast majority of smartphone users on the planet were using a platform that was tracking the hell out of them? They've just been given a big platform to condemn a user privacy issue based on Apple's MEC surveilling, so where the hell is page 2 to protect hundreds of millions of people getting their private data tracked constantly? Speak now or show yourself to be looking for a few headlines.

EFF has been there since day 1, calling out Facebook and Google but also calling out Apple when it was needed. Where were the rest of these letter signers? Unfortunately a few of them were probably, I suspect, getting "third-party research" grants. See how that works?

iadlib 15 Years · 117 comments

This is admirable and I like the intention behind it. But to play devil's advocate, I bring up that this technique, maybe, possibly, could be adapted for other kinds of data. Say emails. Text messages. Let's say it becomes an API or a baked-in feature. What if that feature gets hijacked? By a hacker, or a government? To search for anti-state speech in China, or even for industrial espionage. This is a Pandora's box that couldn't ever be shut.

Rayz2016 9 Years · 6957 comments

DAalseth said:
No matter how well-intentioned, this effort will be used to damage Apple's reputation, severely. It should be abandoned immediately.
Remember how Apple was excoriated by some last year for having a "monopoly" on COVID reporting apps, and that was a free thing they did with Google and kept no data. Apple just stuck a big red Kick Me sign on their back.

Apple will insist there is no back door into the system, but what they don't realise is that this is the back door. This is the back door that governments have been asking for. All they need to do is add hashes from other databases (searching for pictures of dissidents, words from subversive poetry), tweak the threshold (you have to have four hits instead of eight), and you have an authoritarian government's wet dream. It is the ultimate surveillance tool.


More of a back passage than a back door, centrally controlled by Apple and law enforcement, allowing every phone to spy on its user. 

It's odd, but I'm typing this message on my iPad, and I have this notion that I no longer trust it, nor my iPhone, nor my Macs. I'm wary of them now. Even if Apple did reverse course (which they won't), I don't think that trust is coming back.