
Apple employees express concern over new child safety tools


Apple employees are voicing concern over the company's new child safety features set to debut with iOS 15 this fall, with some saying the decision to roll out such tools could tarnish Apple's reputation as a bastion of user privacy.

Pushback against Apple's newly announced child safety measures now includes critics from its own ranks who are speaking out on the subject in internal Slack channels, reports Reuters.

Announced last week, Apple's suite of child protection tools includes on-device processes designed to detect and report child sexual abuse material uploaded to iCloud Photos. Another tool protects children from sensitive images sent through Messages, while Siri and Search will be updated with resources to deal with potentially unsafe situations.

Since the unveiling of Apple's CSAM measures, employees have posted more than 800 messages to a Slack channel on the topic that has remained active for days, the report said. Those concerned about the upcoming rollout cite common worries pertaining to potential government exploitation, a theoretical possibility that Apple deemed highly unlikely in a new support document and statements to the media this week.

The pushback within Apple, at least as it pertains to the Slack threads, appears to be coming from employees who are not part of the company's lead security and privacy teams, the report said. Those working in the security field did not appear to be "major complainants" in the posts, according to Reuters sources, and some defended Apple's position by saying the new systems are a reasonable response to CSAM.

In a thread dedicated to the upcoming photo "scanning" feature (the tool matches image hashes against a hashed database of known CSAM), some workers have objected to the criticism, while others say Slack is not the forum for such discussions, the report said. Some employees expressed hope that the on-device tools will herald full end-to-end iCloud encryption.
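The hash-matching approach described above can be sketched in a few lines of Python. This is a deliberately simplified illustration: it substitutes an exact SHA-256 digest for Apple's NeuralHash (a perceptual hash robust to resizing and re-encoding) and skips the on-device private set intersection entirely, so the hash function, database contents, and filenames here are all made up for demonstration.

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash: a plain SHA-256 digest.
    # Apple's actual system uses NeuralHash, which tolerates
    # re-encoding and resizing; SHA-256 does not.
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_hash(image_bytes: bytes, known_hashes: set[str]) -> bool:
    # Compare the image's hash against the database of known hashes.
    # The image content itself is never inspected, only its hash.
    return image_hash(image_bytes) in known_hashes

# Hypothetical database of known hashes (illustrative values only).
known = {image_hash(b"known-image-1"), image_hash(b"known-image-2")}

print(matches_known_hash(b"known-image-1", known))   # True
print(matches_known_hash(b"holiday-photo", known))   # False
```

The key property the commenters debate is visible even in this toy version: the check operates on hashes of known material, not on the semantic content of a user's photos.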

Apple is facing down a cacophony of condemnation from critics and privacy advocates who say the child safety protocols raise a number of red flags. While some of the pushback can be chalked up to misinformation stemming from a basic misunderstanding of Apple's CSAM technology, other critics raise legitimate concerns about mission creep and violations of user privacy that the company did not initially address.

The Cupertino tech giant has attempted to douse the fire by addressing commonly cited concerns in a FAQ published this week. Company executives are also making the media rounds to explain what Apple views as a privacy-minded solution to a particularly odious problem. Despite its best efforts, however, controversy remains.

Apple's CSAM detection tools launch with iOS 15 this fall.



66 Comments

Ofer 8 Years · 270 comments

Sadly, even with this new tool, Apple is still the best game in town as far as user privacy is concerned. I’m a big fan of the company and have been using their products since the very first computer I’ve owned (anyone remember the Apple IIgs?). However, if they continue in this trajectory, I may have to start researching other options.

tylersdad 13 Years · 310 comments

The folks working in privacy and information security were probably told to dummy up. 

MisterKit 8 Years · 516 comments

So Apple scans our photo library for a hit on known child porn. Somebody at some point along the chain had to watch it and establish the library.

OctoMonkey 4 Years · 343 comments

Ofer said:
Sadly, even with this new tool, Apple is still the best game in town as far as user privacy is concerned. I’m a big fan of the company and have been using their products since the very first computer I’ve owned (anyone remember the Apple IIgs?). However, if they continue in this trajectory, I may have to start researching other options.

Yes, I remember the IIgs, still have one...  as well as my first Macintosh (a platinum Plus I purchased new back in '87).

Sadly, due to this CSAM scanning, I will almost certainly be moving away from Apple.  I have already disabled the auto-upgrade on our dozen, or so, iOS based devices.  When iOS 15 comes out, that will be the end of updates for us.  Since we don't use iCloud and rarely text, this "feature" would have little impact on us.  However, the very notion that a company (any company) believes it has the right to place what amounts to spyware on devices I own is completely unacceptable!  If this means eventually going back to a "dumb" phone, then so be it.  Even if Apple were to state it will not include this "feature", I would probably not trust them enough to believe them...  perhaps they would just do it anyway without telling anybody.

In fairness, I grew up consuming more than my fair share of dystopian novels and movies which might be influencing my position a bit.  :-)

mknelson 9 Years · 1148 comments

MisterKit said:
So Apple scans our photo library for a hit on known child porn. Somebody at some point along the chain had to watch it and establish the library.

That's not quite the process. It isn't viewing your library in the way some articles imply.

https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf

The hash list is provided by National Center for Missing and Exploited Children (NCMEC). The images or likely just the hashes would have been provided by prosecutors/police agencies/child welfare.

The document above describes the hashing algorithm, which computes a hash value for each image. The hash values computed for the photos in the iCloud library are compared to the hash values in the list.

An account is only flagged once a certain number of such hash matches is reached.
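That thresholding step can be sketched roughly as follows. This is a minimal illustration, not Apple's implementation: the threshold value and the hash strings are made up, and the real system reveals nothing about individual matches until the threshold is crossed, which this toy version does not model.

```python
MATCH_THRESHOLD = 30  # illustrative value, not Apple's actual threshold

def count_matches(photo_hashes, known_hashes):
    # Count how many of the account's photo hashes appear in the known list.
    return sum(1 for h in photo_hashes if h in known_hashes)

def account_flagged(photo_hashes, known_hashes, threshold=MATCH_THRESHOLD):
    # The account is surfaced for review only once the number of
    # matches reaches the threshold; isolated matches do nothing.
    return count_matches(photo_hashes, known_hashes) >= threshold

known = {"h1", "h2", "h3"}
print(account_flagged(["h1", "h2"], known))             # False: 2 matches < 30
print(account_flagged(["h1", "h2", "h3"] * 10, known))  # True: 30 matches
```

The point of the threshold is that a single false-positive hash collision cannot flag an account; only an accumulation of matches does.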