WhatsApp chief Will Cathcart on Friday dinged Apple's plan to roll out new Child Safety features that scan user photos to find offending images, the latest in a string of criticism from tech experts who suggest the system flouts user privacy.
Cathcart outlined his position in a series of tweets, saying Apple's plan to combat child sexual abuse material (CSAM) is a step in the wrong direction and represents a "setback for people's privacy all over the world."
"Apple has long needed to do more to fight CSAM, but the approach they are taking introduces something very concerning into the world," Cathcart said. "Instead of focusing on making it easy for people to report content that's shared with them, Apple has built software that can scan all the private photos on your phone — even photos you haven't shared with anyone. That's not privacy."
Announced on Thursday, Apple's tools will help identify and report CSAM by comparing hashes of images uploaded to iCloud against a database of known CSAM hashes provided by the National Center for Missing & Exploited Children (NCMEC) and other child safety organizations.
Before a photo is uploaded to iCloud Photos, a hash is generated on-device and checked against the database, which Apple transforms into an unreadable set of hashes for storage on the user's device. The process results in a cryptographic safety voucher that is subsequently sent to iCloud along with the photo. Using a technique called threshold secret sharing, the system ensures voucher contents cannot be viewed by Apple unless an iCloud account surpasses a predefined threshold of matches. Further, Apple can only interpret vouchers related to matching CSAM images.
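The flow described above can be sketched in miniature. This is a hypothetical illustration only: Apple's system uses a perceptual hash (NeuralHash) and real threshold secret sharing, whereas this sketch substitutes SHA-256 and a simple match counter; the function names, the sample data, and the threshold value are all invented for the example.

```python
import hashlib

# Stand-in for the NCMEC-provided database of known-image hashes.
# (SHA-256 of placeholder bytes; the real system uses perceptual hashes.)
KNOWN_HASHES = {hashlib.sha256(f"known-image-{i}".encode()).hexdigest()
                for i in range(3)}

MATCH_THRESHOLD = 2  # server can interpret vouchers only past this count


def make_voucher(image_bytes):
    """On-device step: hash the image before upload and attach a voucher.

    In Apple's design the match result is cryptographically hidden inside
    the voucher; here it is a plain flag for illustration.
    """
    digest = hashlib.sha256(image_bytes).hexdigest()
    return {"payload": "<encrypted>", "matched": digest in KNOWN_HASHES}


def server_review(vouchers):
    """Server step: act only once matching vouchers pass the threshold.

    Threshold secret sharing means the contents are mathematically
    unreadable below the threshold; this sketch just counts.
    """
    matches = sum(v["matched"] for v in vouchers)
    return matches >= MATCH_THRESHOLD


uploads = [b"holiday-photo", b"known-image-0", b"known-image-1"]
vouchers = [make_voucher(img) for img in uploads]
print(server_review(vouchers))  # True: two matches meet the threshold
```

A single match leaves the account below the threshold, so `server_review` returns `False`; this is the property Apple cites as preserving privacy for accounts that never accumulate matches.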
Apple says the technique enables CSAM detection while affording end users a high level of privacy.
Cathcart also takes issue with another new feature that automatically blurs sexually explicit images in Messages if a user is under 17 years old. Additionally, parents can opt to be notified when a child under 13 years old sends or receives a message containing such material. Images are analyzed on-device with the help of machine learning.
"I read the information Apple put out yesterday and I'm concerned. I think this is the wrong approach and a setback for people's privacy all over the world," Cathcart said. "People have asked if we'll adopt this system for WhatsApp. The answer is no."
Cathcart said WhatsApp has worked hard to fight CSAM and last year reported more than 400,000 cases to NCMEC without breaking its encryption protocols.
It is perhaps unsurprising that WhatsApp, an arm of Facebook, is quick to bemoan Apple's initiative. Facebook is under threat from privacy-minded changes Apple delivered with iOS 14.5. Called App Tracking Transparency, the feature requires developers to ask users for permission before using Identification for Advertisers (IDFA) tags to track their activity across apps and the web. Facebook believes a majority of users will opt out of ad tracking, severely disrupting its main source of revenue.
Along with Cathcart, other tech industry insiders have spoken out against Apple's child protection changes. Edward Snowden, Epic CEO Tim Sweeney, the Electronic Frontier Foundation and others have posited that the system, while well intentioned, could lead to abuse by governments or nefarious parties.
The uproar in large part stems from Apple's very public commitment to user privacy. Over the past few years the company has positioned itself as a champion of privacy and security, investing in advanced hardware and software features to forward those goals. Critics argue Apple's new Child Safety tools will tarnish that reputation.
25 Comments
When even the likes of WhatsApp are piling on with regards to privacy, you know you’ve made a mistake.
I think Apple should take time and make sure they implement something like this only after all the pros and cons are sorted out. As much as I admire their philosophy, it doesn't mean they always have the correct solution. We all want to protect innocent children from unnecessary harm, but if parents have to deal with claims that end up being deemed harmless, children would still be affected negatively. Children could possibly be removed from homes by social services while these things play out.
There are some concerns to work out, but I applaud Apple for doing something to combat child sexual abuse materials. It's only going to roll out in the US at first, uses hashes from reputed sources, and requires multiple hash matches before being reported. The people and groups attacking Apple haven't done anything themselves to protect children or offered alternative measures. If you aren't part of the solution, you are part of the problem, WhatsApp, EFF, and Epic.
Poster "I Got A Bowl Cut": sarcasm for the win.
Facebook (WhatsApp) is one of the worst, if not the worst, purveyors of private data exploitation. That's not up for debate, though there will be those who argue it (they hate the truth on it). But how ironic that Apple, the 'your private data is private' company, has ceded that high ground to probably the biggest purveyor of private data exploitation. That's a mistake one would think had to be a comedy of errors.
Apple claims they have a privacy-preserving way to check for disturbing kiddie porn images without changing the underlying privacy standards? Great, gee, why not lead with that information and an easy-to-understand explanation of why it works? Versus letting it all out at once as 'scanning your pics' (then fill in the creepy kiddie porn thing). Apple, use your friggin heads and explain the data privacy first! Tomorrow or next week, roll out "We are trying to help exploited children." Come on, this isn't rocket science; you are a privacy-first company, so explain the data privacy angle first. Instead they sound like just another surveillance capitalism company, to the point where Facebook (WhatsApp) gets to claim the high ground for data privacy? Surreal....
I have to say as an iOS user (and Apple booster), I have a lot of conflicted feelings about this. I think the premise of NCMEC is flawed: scanning iCloud for a collection of known child abuse photos (I am guessing they are 5-30 years old) will do little to prevent ongoing child abuse. I realize that there may still be utility to removing these images from circulation, helping survivors cope, etc., but I doubt that active child abuse will be thwarted by this program, since by design it is backward looking.
And then we have to deal with governments and bad actors planting these known photos on devices to implicate the innocent, the slippery slope, etc.
It seems to me that basic community policing will do more to prevent child abuse. In my community, most of the notorious cases of child abuse were discovered by neighbors and the community. The abuse was being perpetrated by the marginalized (strange religious beliefs that their children were devil-possessed, etc.), and I suspect community vigilance will do more to protect the children.