EFF protesting Apple CSAM identification programs on Monday evening

An image of 'Apple surveillance' promoting the EFF protest

On Monday at 6:00 PM local time, the EFF and other privacy groups are protesting Apple's CSAM identification initiatives in person at select Apple Retail stores across the country.

The Electronic Frontier Foundation is sponsoring a nationwide protest of the on-device CSAM detection features Apple announced, then delayed, for iOS 15 and macOS Monterey. Protests are being held in several major US cities, including San Francisco, Atlanta, New York, Washington, D.C., and Chicago.

A post from the EFF outlines the protest and simply tells Apple, "Don't scan our phones." The EFF has been one of the most vocal opponents of the CSAM detection system that was slated to ship with iOS 15, arguing that the technology is no better than mass government surveillance.

"We're winning— but we can't let up the pressure," the EFF said in a blog post. "Apple has delayed their plan to install dangerous mass surveillance software onto their devices, but we need them to cancel the program entirely."

The in-person protests are only one avenue of attack the EFF has planned. The organization sent 60,000 petitions to Apple on September 7, cites over 90 organizations backing the movement, and plans to fly an aerial banner over Apple's campus during the "California Streaming" event.

Those interested can find a protest in their area, sign up for newsletters, and email Apple leadership directly from the EFF's website.

Cloud providers like Google and Microsoft already scan their users' photo collections for CSAM, but they do so in the cloud rather than on users' devices. The concern with Apple's implementation lies in where the processing takes place.

Apple designed its system so that hash matching against the CSAM database would take place on the iPhone before images were uploaded to iCloud. The company says this implementation is safer and more private than trying to identify users' photos on its cloud servers.
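As a rough illustration of where that matching step would sit, here is a minimal Swift sketch. It is not Apple's actual NeuralHash and private set intersection design; the SHA-256 stand-in for the perceptual hash, the type and function names, and the threshold value are all placeholders.

```swift
import Foundation
import CryptoKit

// Hypothetical sketch of on-device matching before iCloud Photos upload.
// NOTE: this is not Apple's NeuralHash / private-set-intersection protocol.
// SHA-256 stands in for a perceptual hash, and every name here is invented
// purely to show where in the pipeline the check would happen.
struct CSAMMatcher {
    // Hash database assumed to ship inside the OS image (blinded, in the announced design).
    let knownHashes: Set<Data>
    // Nothing is reportable until this many matches accumulate (Apple cited roughly 30).
    let reportThreshold: Int
    private(set) var matchCount: Int

    init(knownHashes: Set<Data>, reportThreshold: Int) {
        self.knownHashes = knownHashes
        self.reportThreshold = reportThreshold
        self.matchCount = 0
    }

    // Called for each photo just before it is queued for upload.
    // Returns true only once the accumulated match count crosses the threshold.
    mutating func checkBeforeUpload(_ imageData: Data) -> Bool {
        let digest = Data(SHA256.hash(data: imageData)) // placeholder for a perceptual hash
        if knownHashes.contains(digest) {
            matchCount += 1
        }
        return matchCount >= reportThreshold
    }
}

// Hypothetical usage with an empty database and dummy image bytes.
var matcher = CSAMMatcher(knownHashes: [], reportThreshold: 30)
let flagged = matcher.checkBeforeUpload(Data([0x00, 0x01]))
print(flagged) // false: no matches, so the threshold is never reached
```

That ordering is the core of Apple's privacy argument: because matching happens on the device before upload and nothing becomes readable to Apple until the match count crosses the threshold, the company says it never learns anything about non-matching photos, whereas server-side scanning requires analyzing every uploaded image.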

A series of messaging missteps, general confusion surrounding the technology, and concern about how it might be abused led Apple to delay the feature.



12 Comments

StrangeDays 12980 comments · 8 Years

Scanning on our devices, bad, scanning on servers, good? I guess it doesn't matter which way they do it, but it would have kept the positives-count private if done on device.

Either way, CSAM hash scanning for kids being raped has happened for years, and will continue to. Are they going to protest Dropbox, Google, Microsoft, Tumblr, Twitter, etc?

https://nakedsecurity.sophos.com/2020/01/09/apples-scanning-icloud-photos-for-child-abuse-images/

https://www.microsoft.com/en-us/photodna

https://protectingchildren.google/intl/en/

mcdave 1927 comments · 19 Years

I don’t get this. Other scanning systems can do whatever, but Apple’s only reports anything if a significant collection of known CSAM is found, and that’s worse?

maestro64 5029 comments · 19 Years

Anyone who thinks it's a good idea for Apple to scan your phone for images fails to see the bad effects this seemingly altruistic program can have down the line.

As I've shared before, it only works for images that are already known and stored in the CSAM database: the images that sick people trade in. It cannot catch newly created images, since they are not in the database and no hash has been created for them. If it could catch new images that are not in the database, it would have to be looking for features, which could flag other nude images of young kids that are deemed acceptable, and it becomes an invasion of everyone's privacy. Once the tech is installed, what is going to stop bad actors from using it for other reasons?

No one can say it will only catch the bad guys; no one knows that to be true, since no one really knows how it works. This will catch the stupid people, not the people who are smart and dangerous and who are creating the original images.

You are free to give away your own privacy, but you're not free to give away everyone else's. I am glad to see the EFF standing up.

crowley 10431 comments · 15 Years

maestro64 said:
It cannot catch newly created images, since they are not in the database and no hash has been created for them.

Yes, that's a limitation, but not a reason not to do it.

maestro64 said:
If it could catch new images that are not in the database, it would have to be looking for features, which could flag other nude images of young kids that are deemed acceptable, and it becomes an invasion of everyone's privacy.

It can't.

maestro64 said:
Once the tech is installed, what is going to stop bad actors from using it for other reasons?

Apple in the first, most obvious instance.  Though you'll need to be clearer about which bad actors you're talking about. 

maestro64 said:

No one can say it will only catch the bad guys; no one knows that to be true, since no one really knows how it works. This will catch the stupid people, not the people who are smart and dangerous and who are creating the original images.

It's been pretty comprehensively documented, so people do know how it works.  And if it catches some stupid paedophiles then that's good, I'm not precious about the IQ of my child molesters.

maestro64 said:

You are free to give away your own privacy, but you're not free to give away everyone else's. I am glad to see the EFF standing up.

No one's privacy is being given away.  If you don't like it, don't use iCloud Photos.

StrangeDays 12980 comments · 8 Years

maestro64 said:
Anyone who thinks it's a good idea for Apple to scan your phone for images fails to see the bad effects this seemingly altruistic program can have down the line.

As I've shared before, it only works for images that are already known and stored in the CSAM database: the images that sick people trade in. It cannot catch newly created images, since they are not in the database and no hash has been created for them. If it could catch new images that are not in the database, it would have to be looking for features, which could flag other nude images of young kids that are deemed acceptable, and it becomes an invasion of everyone's privacy. Once the tech is installed, what is going to stop bad actors from using it for other reasons?

No one can say it will only catch the bad guys; no one knows that to be true, since no one really knows how it works. This will catch the stupid people, not the people who are smart and dangerous and who are creating the original images.

You are free to give away your own privacy, but you're not free to give away everyone else's. I am glad to see the EFF standing up.

Hard to understand what you’re saying… but how is it any different from scanning for hashes of child rape on the server side? Same exact scanning. The hand-waving about “But what if the government adds other images!” doesn’t make any sense, since by that logic they could be added to server-side CSAM scanning just as easily. Arguably more easily, in fact, as it doesn’t require a software push the way on-device scanning does. So what is the difference?

You can reject ANY scanning, server or on device, simply by declining to use iCloud Photos. Same for Google or Microsoft cloud offerings.