
Apple provides detailed reasoning behind abandoning iPhone CSAM detection

Apple's scrapped CSAM detection tool

A child safety group pressed Apple on why it abandoned its announced CSAM detection feature, and the company has given its most detailed explanation yet for backing off its plans.

Child Sexual Abuse Material is a severe, ongoing concern that Apple attempted to address with on-device and iCloud detection tools. Those controversial tools were ultimately abandoned in December 2022, leaving more controversy in their wake.

A child safety group known as Heat Initiative told Apple it would organize a campaign against the decision to abandon CSAM detection, hoping to force the company to offer such tools. Apple responded in detail, and Wired, which obtained the response, detailed its contents in a report.

The response focuses on consumer safety and privacy as Apple's reasons for pivoting to a feature set called Communication Safety. Building a way to access information that is normally encrypted runs counter to Apple's wider privacy and security stance, a position that continues to upset world powers.

"Child sexual abuse material is abhorrent and we are committed to breaking the chain of coercion and influence that makes children susceptible to it," wrote Erik Neuenschwander, Apple's director of user privacy and child safety.

"Scanning every user's privately stored iCloud data would create new threat vectors for data thieves to find and exploit," Neuenschwander continued. "It would also inject the potential for a slippery slope of unintended consequences. Scanning for one type of content, for instance, opens the door for bulk surveillance and could create a desire to search other encrypted messaging systems across content types."

"We decided to not proceed with the proposal for a hybrid client-server approach to CSAM detection for iCloud Photos from a few years ago," he finished. "We concluded it was not practically possible to implement without ultimately imperiling the security and privacy of our users."

Neuenschwander was responding to a request from Sarah Gardner, leader of the Heat Initiative, who asked why Apple backed away from its on-device CSAM detection program.
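
The proposal Neuenschwander refers to was a hybrid client-server design: devices would match fingerprints of photos against a database of known CSAM hashes before upload, and Apple could act only after a threshold of matches was crossed. The sketch below is purely a schematic of that hash-matching-with-threshold idea, not Apple's actual NeuralHash and private set intersection pipeline; the IllustrativeCSAMMatcher type, its knownHashes set, and the use of SHA-256 as a stand-in for a perceptual hash are all assumptions made for illustration.

```swift
import Foundation
import CryptoKit

// Schematic only: Apple's published design used a perceptual hash (NeuralHash),
// a blinded hash database, and private set intersection so that non-matches
// revealed nothing to either side. SHA-256 is merely a placeholder digest here.
struct IllustrativeCSAMMatcher {
    var knownHashes: Set<Data> = []   // hypothetical database of known-image digests
    var matchCount = 0
    let reportingThreshold = 30       // the published design required roughly 30 matches

    // Returns true only once the match threshold has been crossed.
    mutating func check(imageData: Data) -> Bool {
        let digest = Data(SHA256.hash(data: imageData))
        if knownHashes.contains(digest) {
            matchCount += 1
        }
        return matchCount >= reportingThreshold
    }
}
```

Even in this toy form, the objection in Apple's letter is visible: whoever supplies the known-hash database decides what gets flagged, which is the slippery slope Neuenschwander describes.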

"We firmly believe that the solution you unveiled not only positioned Apple as a global leader in user privacy but also promised to eradicate millions of child sexual abuse images and videos from iCloud," Gardner wrote. "I am a part of a developing initiative involving concerned child safety experts and advocates who intend to engage with you and your company, Apple, on your continued delay in implementing critical technology."

"Child sexual abuse is a difficult issue that no one wants to talk about, which is why it gets silenced and left behind, added Gardner. "We are here to make sure that doesn't happen."

Rather than take an approach that would violate user trust and make Apple a middleman for processing reports, the company wants to help direct victims to resources and law enforcement. Developer APIs that apps like Discord can use will help educate users and direct them to report offenders.
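
The public path for such apps is the SensitiveContentAnalysis framework that arrived alongside iOS 17, which exposes the same on-device nudity detection Apple uses in its own features. The snippet below is a minimal sketch assuming the SCSensitivityAnalyzer API as Apple documented it; the shouldBlur helper, its parameter, and the fallback behavior are illustrative choices, not anything Apple or Discord actually ships.

```swift
import Foundation
import SensitiveContentAnalysis

// Minimal sketch: ask the system's on-device analyzer whether an image likely
// contains nudity before displaying it. Calling apps need Apple's Sensitive
// Content Analysis entitlement, and analysis only runs if the user has enabled
// Sensitive Content Warning or Communication Safety.
func shouldBlur(imageAt imageURL: URL) async -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // If neither safety setting is on, the framework performs no analysis.
    guard analyzer.analysisPolicy != .disabled else { return false }

    do {
        let analysis = try await analyzer.analyzeImage(at: imageURL)
        return analysis.isSensitive
    } catch {
        return false // analysis failed; show the image unmodified
    }
}
```

Because the check runs entirely on-device, nothing about the image leaves the phone, which is the trade-off Apple chose over server-side scanning.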

Apple doesn't "scan" photos stored on device or in iCloud in any way. Instead, the Communication Safety feature, which can be enabled for children's accounts, uses on-device analysis to detect nudity in images sent or received in Messages, and it does not notify parents when nudity is detected.

The feature set is being extended to adults in iOS 17 through a Sensitive Content Warning setting, which lets users blur unwanted nude images sent to them in Messages. Apple hopes to expand these features to more areas in the future.
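
For developers, the same framework also reports which of the two settings is active, so an app can match its warning interface to the user's situation. A small sketch, assuming the iOS 17 policy values (.descriptiveInterventions for Communication Safety on child accounts, .simpleInterventions for the adult opt-in Sensitive Content Warning); the returned strings are placeholders.

```swift
import SensitiveContentAnalysis

// Sketch: tailor the warning UI to whichever safety setting is enabled.
func interventionStyle() -> String {
    let policy = SCSensitivityAnalyzer().analysisPolicy
    if policy == .descriptiveInterventions {
        // Communication Safety, enabled for child accounts via Screen Time:
        // show fuller, age-appropriate guidance and help resources.
        return "descriptive"
    } else if policy == .simpleInterventions {
        // Sensitive Content Warning, the opt-in setting for everyone else:
        // a brief blur-and-warn overlay is enough.
        return "simple"
    }
    return "none" // analysis disabled
}
```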



25 Comments

mark fearing 441 comments · 16 Years

Apple is not a law enforcement organization. That should answer the question. Period.

appleinsideruser 662 comments · 5 Years

Great that Apple heard the outcry of the slippery slope to snooping — they don’t often eat humble pie!

entropys 4316 comments · 13 Years

“Won’t someone please think of the children!”

seriously, the world is awash with people who, in the name of good, do great harm. This is a gold medal example.

and good on Apple for realising that many people's concerns about how this system could be exploited by bad actors were valid, and for adopting those concerns as its own position.

danox 3442 comments · 11 Years

Good luck with the dawn of AI and the ability to manipulate and create millions of pictures with no context or intelligence behind them, just press the button and go. Within 10 years, what's real and what's not?

badmonk 1336 comments · 11 Years

They should also talk to parents who texted pictures of their children’s rash to the pediatrician and got caught up in CSAM investigations.  As bad as CSAM is, larger privacy issues are important as well.  Apple is doing the right thing here.