
Apple wipes on-device CSAM photo monitoring from site, but plans unchanged

Apple removed all references to its CSAM initiative from its Child Safety webpage at some point overnight, but the company has made it clear that the program is still coming.

Apple announced in August that it would be introducing a collection of features to iOS and iPadOS to help protect children from predators and limit the spread of Child Sexual Abuse Material (CSAM). Following considerable criticism of the proposal, it appeared that Apple was pulling back from the effort.

The Child Safety page on Apple's website included a detailed overview of the upcoming child safety tools until December 10. After that date, MacRumors reports, the references were wiped from the page.

It is unusual for Apple to completely remove mentions of a product or feature that it couldn't release to the public, though it famously did so with the AirPower charging pad. Such a removal could be read as Apple publicly giving up on the feature, though a second attempt in the future always remains plausible.

Apple's proposal consisted of several elements, each protecting children in a different way. The main CSAM feature would use on-device systems to detect abuse imagery stored in a user's Photos library. With iCloud Photos enabled, the feature would scan a user's photos on-device and compare them against a hash database of known infringing material.

A second feature, intended to protect young users from seeing or sending sexually explicit photos within iMessage, has already been implemented. Originally, this feature was going to notify parents when such an image was found, but as implemented, there is no external notification.

A third element added updates to Siri and Search, offering additional resources and guidance.

Of the three, the latter two made it into the release of iOS 15.2 on Monday. The more contentious CSAM element did not.

The Child Safety page on Apple's website previously explained that Apple was introducing on-device CSAM scanning, which would compare image hashes against a database of known CSAM image hashes. If a sufficient number of images were flagged as CSAM, Apple would have contacted the National Center for Missing and Exploited Children (NCMEC) on the matter.
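For illustration only, the sketch below shows the general shape of threshold-gated hash matching: compute a digest for each photo, check it against a set of known hashes, and act only once a minimum number of matches accumulates. It is not Apple's implementation; the announced system used a perceptual NeuralHash and cryptographic private set intersection, rather than the plain SHA-256 lookup, placeholder hash values, and illustrative threshold used here.

import Foundation
import CryptoKit

// Illustrative sketch only -- not Apple's NeuralHash or private set intersection.
// Hypothetical SHA-256 digests stand in for the database of known image hashes.
let knownHashes: Set<String> = [
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b" // placeholder
]

// Illustrative threshold: no report is made until this many matches accumulate.
// Apple's published threat-model review discussed an initial threshold of about 30.
let reportingThreshold = 30

// Hashes a photo's raw bytes and checks membership in the known-hash set.
func isMatch(_ photoData: Data) -> Bool {
    let digest = SHA256.hash(data: photoData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)
}

// Counts matches across a photo library; only flags for review past the threshold.
func scan(library: [Data]) {
    let matchCount = library.filter(isMatch).count
    if matchCount >= reportingThreshold {
        print("\(matchCount) matches: threshold exceeded, would be flagged for human review")
    } else {
        print("\(matchCount) matches: below threshold, nothing is reported")
    }
}

scan(library: [Data("example photo bytes".utf8)])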

The updated version of the page removes not only the section on CSAM detection, but also the CSAM references in the page's introduction and in the Siri and Search guidance, along with a section that linked to PDFs explaining and assessing the CSAM process.

On Wednesday afternoon, The Verge was told that Apple's plans are unchanged, and that the pause for a rethink is still just that: a temporary delay.

Shortly after announcing its proposed tools in August, Apple had to respond to objections from security and privacy experts, as well as other critics, about the CSAM scanning itself. Critics viewed the system as a privacy violation and the start of a slippery slope that could lead to governments using an expanded form of it to perform on-device surveillance.

Feedback came from a wide range of critics, from privacy advocates and some governments to a journalist association in Germany, Edward Snowden, and Bill Maher. Despite putting forward its case for the feature, and admitting that it had failed to communicate it properly, Apple postponed the launch in September to rethink the system.

Update December 15, 1:39 PM ET: Updated with Apple clarifying that its plans are unchanged.



18 Comments

[Deleted User] 0 comments · 6 Years

Bravo, paranoids. Thanks to you, iCloud and iCloud Drive won’t be end-to-end encrypted, after all. Bingo! /s

badmonk 1336 comments · 11 Years

I suspect CSAM screening will ultimately be performed and confined to the iCloud server side, like every other cloud-based service has been doing for years (without talking about it).

I always thought iCloud screened for CSAM; after all, MSFT and Google have been doing it for years.

Freedom has its limits.

F_Kent_D 98 comments · 6 Years

I myself didn’t have any issues with any of it TBH. I don’t have child pornography on anything I own and neither does anyone I know so I have nothing to worry about. They never were going to physically look at every photo, these are hash scans for particular data in the file details itself, not the actual photos. I also allowed the notifications setting on my 11 year old Daughter’s phone to notify me of potential unacceptable messages being sent or received. Let the paranoid people kill what could actually help end the CP sickness that’s more of an issue today than ever. 

Cesar Battistini Maziero 410 comments · 8 Years

Great! And the pedos will keep getting away with it.