
Civil rights groups worldwide ask Apple to drop CSAM plans

Apple's new child protection feature


More than 80 civil rights groups have sent an open letter to Apple, asking the company to abandon its child safety plans in Messages and Photos, fearing expansion of the technology by governments.

Following the German government's description of Apple's Child Sexual Abuse Material plans as surveillance, 85 organizations around the world have joined the protest. The groups, 28 of which are US-based, have written to CEO Tim Cook.

"Though these capabilities are intended to protect children and to reduce the spread of child sexual abuse material (CSAM)," says the full letter, "we are concerned that they will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children."

"Once this capability is built into Apple products, the company and its competitors will face enormous pressure — and potentially legal requirements — from governments around the world to scan photos not just for CSAM, but also for other images a government finds objectionable," it continues.

"Those images may be of human rights abuses, political protests, images companies have tagged as 'terrorist' or violent extremist content, or even unflattering images of the very politicians who will pressure the company to scan for them," says the letter.

"And that pressure could extend to all images stored on the device, not just those uploaded to iCloud. Thus, Apple will have laid the foundation for censorship, surveillance and persecution on a global basis."

Signatories to the letter, including the Electronic Frontier Foundation, have separately been promoting its criticisms.

The letter concludes by urging Apple to abandon the new features. It also urges "Apple to more regularly consult with civil society groups" in future.

Apple has not responded to the letter. However, Apple software chief Craig Federighi has previously said that the company's child protection message was "jumbled" and "misunderstood."



48 Comments

cjlacz 16 Years · 47 comments

Little late for that now. Apple already opened that can of worms (if it is actually a problem) just by announcing it. I'd still kind of like to see this implemented. I think Apple has addressed this as best it can, and governments can mandate that companies scan for this stuff with laws regardless of any prior implementation. Tech has kind of created this problem of sharing these CSAM photos easily. I'd like to see them as part of the solution too.

mike_galloway 4 Years · 115 comments

cjlacz said:
Little late for that now. Apple already opened that can of worms (if it is actually a problem) just by announcing it. I'd still kind of like to see this implemented. I think Apple has addressed this as best it can, and governments can mandate that companies scan for this stuff with laws regardless of any prior implementation. Tech has kind of created this problem of sharing these CSAM photos easily. I'd like to see them as part of the solution too.

Agreed - our ability to say, post or store online what we want without consequence has gone too far. This has been caused by the tech companies so it’s really up to them to fix it.

foregoneconclusion 12 Years · 2857 comments

FYI to EFF: totalitarian governments already have their populations under surveillance. You're not thwarting totalitarianism by having Apple remove the CSAM hash scanning capability. Citizens of China, Russia, Turkey etc. use smartphones and are still under totalitarian control regardless. 

Apple_Bar 5 Years · 134 comments

foregoneconclusion said:
FYI to EFF: totalitarian governments already have their populations under surveillance. You're not thwarting totalitarianism by having Apple remove the CSAM hash scanning capability. Citizens of China, Russia, Turkey etc. use smartphones and are still under totalitarian control regardless.

So what you are saying is that, since China, Russia, and Turkey have their populations under surveillance, countries under a democracy shouldn't express their VALID privacy concerns about the implementation of this scanning mechanism?


People, don't forget that none of the people arguing about this implementation are against protecting children or against scanning for CSAM material, which, by the way, already occurs on most cloud services.

The concern is about what happens in the wrong hands, or what else a government could pressure Apple to scan for on that device.

And again, I don't mean totalitarian governments, because they just do it. I am talking about every other government.

It doesn't matter how many times Apple tries to explain it. It has the feeling of being a slippery slope for privacy in general.

tedz98 6 Years · 80 comments

Apple can do no wrong in the eyes of many, but this new feature that Apple has developed is wrong. It's a bad capability put to a good use. The objective of reducing the transmission of CSAM is good, but it's like plugging leaks in the proverbial dike: it makes the transmission of illicit content more difficult, yet if implemented it will just push that content onto other pathways.

The byproduct of this action, the scanning of content on people's devices, will be disastrous. Now that governments know Apple has the ability to interrogate the content on people's devices, it won't be long before they require Apple to perform other types of content scanning on devices.

Governments routinely require Apple to divulge iCloud content, which is not encrypted. Users had the option of keeping content secure from government eyes by keeping it on their devices and out of iCloud. This capability will mark the beginning of the end of that security, and it is totally at odds with Apple's heretofore emphasis on the privacy and security of content on its devices.

The law of unintended consequences is going to have a significant impact if this capability is implemented. It's an example of the old Ben Franklin adage about giving up some freedom for better security and ending up with neither. I'm surprised that Apple leadership hasn't thought this decision through better, and I'm fairly sure Apple's marketing department somehow sees it as beneficial to the company and its revenues, which I think is decidedly wrong.