
Activists rally at Apple Park for reinstatement of child safety features

Image credit: Brooke Anderson (@MovementPhotog on X)

Protesters at Apple Park are demanding that the company reinstate its recently abandoned child safety measures.

Nearly three dozen protesters gathered around Apple Park on Monday morning, carrying signs that read, "Why Won't you Delete Child Abuse?" and "Build a future without abuse."

The group, known as Heat Initiative, is demanding that the iPhone maker bring back its child sexual abuse material (CSAM) detection tool.

"We don't want to be here, but we feel like we have to," Heat Initiative CEO Sarah Gardner told Silicon Valley. "This is what it's going to take to get people's attention and get Apple to focus more on protecting children on their platform."

In 2021, Apple had planned to roll out a new feature capable of comparing iCloud photos to a known database of CSAM images. If a match were found, Apple would review the image and report any confirmed findings to the National Center for Missing & Exploited Children (NCMEC), a nonprofit clearinghouse for reports of child abuse material that works with law enforcement agencies across the U.S.

After significant backlash from multiple groups, the company paused the program in mid-December of the same year, then abandoned the plan entirely the following December.

Diagram: Apple's abandoned CSAM detection tool. On-device matching produces safety vouchers that are uploaded to Apple alongside photos; once a match threshold is crossed, Apple reviews the images and reports to NCMEC.
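The flow shown in the diagram can be sketched in rough, illustrative code. To be clear, this is a simplified sketch, not Apple's implementation: the published design used a perceptual hash called NeuralHash, private set intersection, and threshold secret sharing so that no individual voucher was readable before the threshold was crossed, and none of that cryptography is reproduced here. Every type, function, and parameter name below is hypothetical.

```swift
import Foundation

// Illustrative sketch only. In the real design, match results were hidden inside
// encrypted safety vouchers that Apple could not read until an account crossed
// the match threshold; that cryptography is omitted here.

struct SafetyVoucher {
    let photoID: UUID
    let matched: Bool   // in Apple's design this bit was not visible to Apple pre-threshold
}

struct CSAMMatcher {
    let knownHashes: Set<String>   // stand-in for the hashed database of known CSAM
    let reportThreshold: Int       // Apple's published threshold was on the order of 30 matches

    // Stand-in for an on-device perceptual hash such as NeuralHash.
    // A real perceptual hash is robust to resizing and re-encoding; this placeholder is not.
    func perceptualHash(of photoData: Data) -> String {
        String(photoData.hashValue)
    }

    // On-device step: produce a voucher for each photo queued for iCloud upload.
    func makeVoucher(photoID: UUID, photoData: Data) -> SafetyVoucher {
        let hash = perceptualHash(of: photoData)
        return SafetyVoucher(photoID: photoID, matched: knownHashes.contains(hash))
    }

    // Server-side step: escalate to human review only once the threshold is crossed;
    // only a confirmed review would lead to a report to NCMEC.
    func shouldEscalateForHumanReview(vouchers: [SafetyVoucher]) -> Bool {
        vouchers.filter { $0.matched }.count >= reportThreshold
    }
}
```

The threshold check was the design's key safeguard: a single false match was never supposed to be enough, on its own, to expose an account to human review.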

When asked why, Apple said the program would "create new threat vectors for data thieves to find and exploit," and worried that those vectors would compromise security, a topic Apple prides itself on taking very seriously.

While Apple never officially rolled out its CSAM detection system, it did ship Communication Safety, a feature that alerts children when they receive or attempt to send content containing nudity in Messages, AirDrop, FaceTime video messages, and other apps.

However, protesters don't believe that feature does enough to hold predators accountable for possessing CSAM, and would rather Apple reinstate the detection system it abandoned.

"We're trying to engage in a dialog with Apple to implement these changes," protester Christine Almadjian said on Monday. "They don't feel like these are necessary actions."

This isn't the first time that Apple has butted heads with Heat Initiative, either. In 2023, the group launched a multi-million-dollar campaign against Apple.



11 Comments

beowulfschmidt 12 Years · 2361 comments

In 2021, Apple had planned to roll out a new feature capable of comparing iCloud photos to a known database of CSAM images.

As I recall, one of the major bones of contention was that the system was going to scan photos destined for iCloud, but not yet actually there, using resources on the user's phone itself rather than iCloud resources to do the actual scanning. There were also, if I remember correctly, concerns about mistakes, as happened when a similar Google effort flagged a file with a single character in it as problematic and locked an account.

foregoneconclusion 12 Years · 2857 comments

beowulfschmidt said: As I recall, one of the major bones of contention was that the system was going to scan photos destined for iCloud, but not yet actually there, using resources on the user's phone itself rather than iCloud resources to do the actual scanning. There were also, if I remember correctly, concerns about mistakes, as happened when a similar Google effort flagged a file with a single character in it as problematic and locked an account.

It was a controversy that didn't really make much sense. The files that would be scanned were the same regardless of whether the scan happened in the cloud or on the phone itself. The user would choose whether or not to use iCloud for file backup. If they did, they would have to agree to Apple's terms for iCloud (which have always included file scanning) and then choose which applications would have files backed up. Only the files from the applications the user chose to use with iCloud would be scanned, so there was no actual difference in which files were being scanned on the phone versus in the cloud.

DAalseth 6 Years · 3067 comments

Maybe I’m a cynic, but this protest strikes me as more like astroturfing by some group with an agenda than a real protest.

chasm 10 Years · 3624 comments

If these folks are seriously concerned about preventing CSAM distribution, why aren’t they permanently encamped at Meta, Xitter, and Google’s headquarters?

I mean, if they are serious about trying to trample constitutional limits on the right to privacy and are in favor of preemptive state searches without the presumption of innocence, they’d be likely to get much further with those companies (which handle a far greater volume of cloud data than Apple anyway).

gatorguy 13 Years · 24627 comments

chasm said:
If these folks are seriously concerned about preventing CSAM distribution, why aren’t they permanently encamped at Meta, Xitter, and Google’s headquarters?

I mean, if they are serious about trying to trample constitutional limits on the right to privacy and are in favor of preemptive state searches without the presumption of innocence, they’d be likely to get much further with those companies (which handle a far greater volume of cloud data than Apple anyway).

https://protectingchildren.google/#fighting-abuse-on-our-own-platform-and-services

Apple wants to do the right thing, and had everything in place to do so (perhaps going one step too far), but is struggling with how to proceed now without stirring up the minions.