A security expert claims that Apple is about to announce photo identification tools that would detect child abuse images in iOS photo libraries.
Apple has previously removed individual apps from the App Store over child pornography concerns, but now it is reportedly about to introduce such detection system-wide. Using photo hashing, iPhones could identify Child Sexual Abuse Material (CSAM) on device.
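Apple has not described how any such hash matching would work. As a purely illustrative sketch of the general idea behind perceptual photo hashing, the Python snippet below computes a simple "average hash" of an image and compares it against a hypothetical database of known-image fingerprints. The KNOWN_HASHES values, the matches_known_image helper, and the distance threshold are assumptions made for illustration, not Apple's actual system.

```python
# Illustrative only: a basic perceptual "average hash" (aHash). This is NOT
# Apple's scheme, which had not been disclosed at the time of writing.
from PIL import Image


def average_hash(path: str, hash_size: int = 8) -> int:
    """Shrink the image, convert to grayscale, and build a 64-bit hash from
    whether each pixel is brighter than the overall mean."""
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Count of differing bits between two hashes."""
    return bin(a ^ b).count("1")


# Hypothetical fingerprints of known images (placeholder values only).
KNOWN_HASHES = {0x8F3C5A1B2D4E6F70}


def matches_known_image(path: str, threshold: int = 5) -> bool:
    """Flag a photo if its hash is within a small Hamming distance of any
    fingerprint in the hypothetical database."""
    h = average_hash(path)
    return any(hamming_distance(h, k) <= threshold for k in KNOWN_HASHES)
```

Production systems reportedly rely on far more robust hashes that survive resizing and re-encoding, but the matching step, comparing an on-device fingerprint against a list of known fingerprints, is the same in spirit.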
Apple has not confirmed this, and so far the sole source is Matthew Green, a cryptographer and associate professor at the Johns Hopkins Information Security Institute.
I've had independent confirmation from multiple people that Apple is releasing a client-side tool for CSAM scanning tomorrow. This is a really bad idea.
— Matthew Green (@matthew_d_green) August 4, 2021
According to Green, the plan is initially for detection to be client-side, that is, done entirely on a user's iPhone. He argues, however, that this could be the start of a process that leads to surveillance of data traffic sent to and received from the phone.
"Eventually it could be a key ingredient in adding surveillance to encrypted messaging systems," continues Green. "The ability to add scanning systems like this to E2E [end to end encryption] messaging systems has been a major 'ask' by law enforcement the world over."
"This sort of tool can be a boon for finding child pornography in people's phones," he said. "But imagine what it could do in the hands of an authoritarian government?"
Green, with his cryptography students, has previously reported on how law enforcement may be able to break into iPhones. He and Johns Hopkins University have also previously worked with Apple to fix a security bug in Messages.
Comments
I’m totally against child porn. But this capability is a double-edged sword. For Apple, which touts its strong belief in the privacy of its customers using its devices, to begin surveilling customer content sets an unusual and potentially dangerous precedent. What undesirable content will be next to be monitored? Who gets notified when supposed undesirable content is identified? Who determines what constitutes undesirable content? What if governments demand unique monitoring capabilities if Apple wants to sell its products in their countries? Despite the universally agreed-upon disgust towards child pornography, the freedom of the entire Apple ecosystem will be jeopardized if this capability is deployed. This is as good as giving government agencies “back door” access to iPhones, which Apple has vehemently opposed. Very good intentions with very dangerous side effects.
Harvard professor says surveillance capitalism is undermining democracy – Harvard Gazette
No software is perfect. So if this gets installed on everyone's phones only to call the cops on the iPhone owner when the AI detects what it thinks is a naughty photo, imagine the meltdown when it gets it wrong. But even if it gets it right, the "concept" of having AI scan photos for naughty content and then secretly call the cops can lead to an Orwellian scenario real fast.
The biggest problem with this story is how short it is on details. We really need to know how it will be used and what safeguards are put in place to protect privacy. We can ASSUME Apple has that covered, but I'll believe it when I see the details.