
WhatsApp latest to pile on Apple over Child Safety tools

WhatsApp chief Will Cathcart on Friday dinged Apple's plan to roll out new Child Safety features that scan user photos for offending images, the latest in a string of criticisms from tech experts who argue the system flouts user privacy.

Cathcart outlined his position in a series of tweets, saying Apple's plan to combat child sexual abuse material (CSAM) is a step in the wrong direction and represents a "setback for people's privacy all over the world."

"Apple has long needed to do more to fight CSAM, but the approach they are taking introduces something very concerning into the world," Cathcart said. "Instead of focusing on making it easy for people to report content that's shared with them, Apple has built software that can scan all the private photos on your phone — even photos you haven't shared with anyone. That's not privacy."

Announced on Thursday, Apple's tools will help identify and report CSAM by comparing hashes of images uploaded to iCloud against a database of known CSAM image hashes provided by the National Center for Missing & Exploited Children (NCMEC) and other child safety organizations.

Before a photo is uploaded to iCloud Photos, an on-device process generates a hash and matches it against the database, which Apple transforms into an unreadable set of hashes for storage on the user's device. The process results in a cryptographic safety voucher that is subsequently sent to iCloud along with the photo. Using a technique called threshold secret sharing, the system ensures that voucher contents cannot be viewed by Apple unless an iCloud account surpasses a predefined threshold of matches. Further, Apple can only interpret vouchers related to matching CSAM images.

Apple says the technique enables CSAM detection while affording end users a high level of privacy.
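To make the matching-and-threshold flow concrete, here is a minimal sketch in Swift. It is illustrative only, not Apple's implementation: SHA-256 stands in for Apple's perceptual NeuralHash, a plain match counter stands in for the threshold secret sharing cryptography, and the threshold value and all names below are hypothetical.

```swift
import Foundation
import CryptoKit

// Hypothetical stand-ins: Apple's real system uses a perceptual hash
// (NeuralHash) and threshold secret sharing. Here SHA-256 substitutes for
// the hash function and a simple match count substitutes for the crypto.

/// Blinded database of known-CSAM hashes (hex strings), as it might be
/// stored on device. Entries here are placeholders.
let knownHashes: Set<String> = []

/// Hypothetical threshold; Apple has not published the real value.
let matchThreshold = 30

/// Hash a photo's raw bytes (SHA-256 standing in for NeuralHash).
func photoHash(_ photoData: Data) -> String {
    SHA256.hash(data: photoData)
        .map { String(format: "%02x", $0) }
        .joined()
}

/// Voucher contents stay unreadable until an account exceeds the threshold;
/// that gate is modeled here as a simple count of database matches.
func accountExceedsThreshold(photos: [Data]) -> Bool {
    let matches = photos.filter { knownHashes.contains(photoHash($0)) }.count
    return matches > matchThreshold
}
```

Note that a real perceptual hash tolerates small edits to an image, whereas the SHA-256 stand-in above matches only byte-identical files; the sketch captures the threshold logic, not the image-matching behavior.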

Cathcart also takes issue with another new feature that automatically blurs sexually explicit images in Messages if a user is under 17 years old. Additionally, parents can opt to be notified when a child under 13 years old sends or receives a message containing such material. Images are analyzed on-device with the help of machine learning.
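The age-based policy described above can be summarized in a short sketch, again in Swift. The classifier function is a hypothetical stand-in, since Apple has not published its on-device machine-learning model or API; only the decision logic reflects the article.

```swift
import Foundation

/// Hypothetical stand-in for Apple's undisclosed on-device ML classifier.
/// A real implementation would run a Core ML model over the image.
func isSexuallyExplicit(_ imageData: Data) -> Bool {
    return false // placeholder
}

enum MessageImageAction {
    case showNormally
    case blur
    case blurAndNotifyParents
}

/// Policy as described: blur explicit images for users under 17, and,
/// where parents have opted in, also notify them for children under 13.
func action(for imageData: Data, userAge: Int, parentsOptedIn: Bool) -> MessageImageAction {
    guard userAge < 17, isSexuallyExplicit(imageData) else { return .showNormally }
    return (userAge < 13 && parentsOptedIn) ? .blurAndNotifyParents : .blur
}
```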

"I read the information Apple put out yesterday and I'm concerned. I think this is the wrong approach and a setback for people's privacy all over the world," Cathcart said. "People have asked if we'll adopt this system for WhatsApp. The answer is no."

Cathcart said WhatsApp has worked hard to fight CSAM and last year reported more than 400,000 cases to NCMEC without breaking its encryption protocols.

It is perhaps unsurprising that WhatsApp, an arm of Facebook, is quick to bemoan Apple's initiative. Facebook is under threat from privacy-minded changes Apple delivered with iOS 14.5. Called App Tracking Transparency, the feature requires developers to ask users for permission before using Identifier for Advertisers (IDFA) tags to track their activity across apps and the web. Facebook believes a majority of users will opt out of ad tracking, severely disrupting its main source of revenue.

Along with Cathcart, other tech industry insiders have spoken out against Apple's child protection changes. Edward Snowden, Epic CEO Tim Sweeney, the Electronic Frontier Foundation and others have posited that the system, while well-intentioned, could lead to abuse by governments or nefarious parties.

The uproar in large part stems from Apple's very public commitment to user privacy. Over the past few years the company has positioned itself as a champion of privacy and security, investing in advanced hardware and software features to further those goals. Critics argue Apple's new Child Safety tools will tarnish that reputation.