
Apple reportedly plans to make iOS detect child abuse photos

A security expert claims that Apple is about to announce photo identification tools that would detect child abuse images in iOS photo libraries.

Apple has previously removed individual apps from the App Store over child pornography concerns, but it is now reportedly preparing to introduce such detection system-wide. Using photo hashing, iPhones could identify Child Sexual Abuse Material (CSAM) on the device.
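For context on how photo hashing works: rather than judging the content of each new photo, systems like this typically compute a perceptual hash of every image and compare it against a database of hashes of known abuse images. Below is a minimal illustrative sketch of that general idea in Python; the average-hash function, the threshold, and the KNOWN_HASHES set are all assumptions for demonstration, not a description of Apple's unconfirmed system.

```python
# Minimal sketch of perceptual hash matching (illustrative only).
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Shrink to size x size grayscale, then set one bit per pixel
    according to whether it is brighter than the image mean."""
    img = Image.open(path).convert("L").resize((size, size), Image.LANCZOS)
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    # Count the bits where two hashes differ.
    return bin(a ^ b).count("1")

# Hypothetical stand-in for a database of hashes of known images;
# the real database and matching pipeline are not public.
KNOWN_HASHES = {0x8F0E0E1E3E3E7E7E}

def matches_known_image(path: str, threshold: int = 5) -> bool:
    # Flag a photo only if it lands within a few bits of a known hash.
    h = average_hash(path)
    return any(hamming_distance(h, k) <= threshold for k in KNOWN_HASHES)
```

The key property of this approach is that matching runs against known images: a resized or recompressed copy of a catalogued photo lands within a few bits of its stored hash, while a brand-new photo, whatever it depicts, should not match at all.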

Apple has not confirmed this, and so far the sole source is Matthew Green, a cryptographer and associate professor at the Johns Hopkins Information Security Institute.

According to Green, the plan is initially to be client-side, meaning all of the detection would be done on a user's iPhone. He argues, however, that this could be the start of a process that leads to surveillance of data traffic sent to and from the phone.

"Eventually it could be a key ingredient in adding surveillance to encrypted messaging systems," continues Green. "The ability to add scanning systems like this to E2E [end to end encryption] messaging systems has been a major 'ask' by law enforcement the world over."

"This sort of tool can be a boon for finding child pornography in people's phones," he said. "But imagine what it could do in the hands of an authoritarian government?"

Green and his cryptography students have previously reported on how law enforcement may be able to break into iPhones. He and Johns Hopkins University have also previously worked with Apple to fix a security bug in Messages.



26 Comments

crowley 15 Years · 10431 comments

"This sort of tool can be a boon for finding child pornography in people's phones," he said. "But imagine what it could do in the hands of an authoritarian government?"

But it's not in the hands of an authoritarian government? It's in the hands of Apple. If an authoritarian government wanted to do something like this, I have no doubt they'd be capable of doing it; I don't see how Apple going after child abusers is going to affect that.

tedz98 6 Years · 80 comments

I’m totally against child porn. But this capability is a double-edged sword. For Apple, which touts its strong belief in the privacy of its customers, to begin surveilling customer content sets up an unusual and potentially dangerous precedent. What undesirable content will be next to be monitored? Who gets notified when supposedly undesirable content is identified? Who determines what constitutes undesirable content? What if governments demand unique monitoring capabilities if Apple wants to sell its products in their countries? Despite the universally agreed-upon disgust towards child pornography, the freedom of the entire Apple ecosystem will be jeopardized if this capability is deployed. This is as good as giving government agencies “back door” access to iPhones, which Apple has vehemently opposed. Very good intentions with very dangerous side effects.

jdw 18 Years · 1457 comments

No software is perfect. So if this gets installed on everyone's phones only to call the cops on the iPhone owner when the AI detects what it thinks is a naughty photo, imagine the meltdown when it gets it wrong. But even if it gets it right, the "concept" of having AI scan photos for naughty content and then secretly call the cops can lead to an Orwellian scenario real fast.

The biggest problem with this story is how short it is on details.  We really need to know how it will be used and what safeguards are put in place to protect privacy.  We can ASSUME Apple has that covered, but I'll believe it when I see the details.

CloudTalkin 5 Years · 916 comments

crowley said:
"This sort of tool can be a boon for finding child pornography in people's phones," he said. "But imagine what it could do in the hands of an authoritarian government?"
But it's not in the hands of an authoritarian government? It's in the hands of Apple. If an authoritarian government wanted to do something like this, I have no doubt they'd be capable of doing it; I don't see how Apple going after child abusers is going to affect that.

Coupla few things. 
1. It's not in the hands of Apple. It's not in the hands of anyone. It's a thus-far unsubstantiated rumor from a security researcher.

2. If it comes to fruition that Apple does enable the AI feature, wouldn't they be bound by law to report the info to authorities? (idk, ianal.) If the offending data is stored in iCloud, it would also be subject to worldwide government data requests, requests that Apple has honored roughly 80% of the time on average.

3. Keeping in mind this is only a claim by a researcher and not Apple, the questions would then have to be asked: What constitutes child pornography to the AI? Is it reviewed by a human for higher-level verification? If so, by an Apple employee or a third-party source (as with the original Siri voice recordings)? What triggers reporting to authorities, and who bears responsibility for errors?

A parent sending pics of the kids in a bubble bath to the grandparents. A photo of a young-looking 18-year-old girl topless at a nude beach. Scouts shirtless around a campfire.
Would any one of those trigger the AI? What if all three were on the same phone? It's entirely possible and not far-fetched.

I can't stress enough that this isn't Apple going after child abusers; this is a researcher making a claim. But if Apple were to do so, it would most definitely affect that "government access, authoritarian or otherwise" query raised by the researcher, in myriad ways not even addressed in my comment.