Internal Apple memo addresses public concern over new child protection features

An internal memo from Apple reportedly addresses concerns around the company's new CSAM detection and photo scanning features, and aims to uphold its commitment to user privacy while also protecting children.

On Thursday, Apple announced that it would expand child safety features in iOS 15, iPadOS 15, macOS Monterey, and watchOS 8. The new tools include a system that leverages cryptographic techniques to detect collections of child sexual abuse material (CSAM) stored in iCloud Photos, so that information can be provided to law enforcement.
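Apple has not published the system's source, but the general shape of hash-based matching is easy to sketch. The toy example below fingerprints each photo with a simple average hash and checks the fingerprint against a set of known hashes, acting only once a threshold number of matches accumulates. Everything here is an illustrative stand-in: the hash function, the function names, and the threshold value are hypothetical, and none of Apple's actual NeuralHash, private set intersection, or threshold secret sharing machinery is reproduced.

```python
# Minimal sketch of hash-based image matching, for illustration only.
# Apple's real system uses a proprietary perceptual hash (NeuralHash)
# combined with cryptographic protocols; the average hash and the
# threshold value below are hypothetical stand-ins.

from PIL import Image


def average_hash(path: str, size: int = 8) -> int:
    """Reduce an image to a 64-bit fingerprint: grayscale, shrink to
    8x8, then set one bit per pixel that is brighter than the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for px in pixels:
        bits = (bits << 1) | (1 if px > mean else 0)
    return bits


def exceeds_match_threshold(photo_paths, known_hashes, threshold=30):
    """Count exact fingerprint matches against a set of known hashes;
    only a collection of matches above the threshold is flagged."""
    matches = sum(1 for p in photo_paths if average_hash(p) in known_hashes)
    return matches >= threshold
```

In Apple's described design, the comparison happens on-device against a blinded database, and results are only readable by Apple once the match threshold is crossed; the sketch above captures only the fingerprint-matching concept, not those privacy protections.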

The announcement was met with considerable pushback from customers and security experts alike. The most prevalent worry is that the implementation could, at some point in the future, be expanded into broader surveillance of data sent to and from the phone.

An internal memo, obtained by 9to5Mac, addresses these concerns to Apple staff. The memo was reportedly penned by Sebastien Marineau-Mes, a software VP at Apple, and reads as follows:

Today marks the official public unveiling of Expanded Protections for Children, and I wanted to take a moment to thank each and every one of you for all of your hard work over the last few years. We would not have reached this milestone without your tireless dedication and resiliency.

Keeping children safe is such an important mission. In true Apple fashion, pursuing this goal has required deep cross-functional commitment, spanning Engineering, GA, HI, Legal, Product Marketing and PR. What we announced today is the product of this incredible collaboration, one that delivers tools to protect children, but also maintain Apple's deep commitment to user privacy.

We've seen many positive responses today. We know some people have misunderstandings, and more than a few are worried about the implications, but we will continue to explain and detail the features so people understand what we've built. And while a lot of hard work lies ahead to deliver the features in the next few months, I wanted to share this note that we received today from NCMEC. I found it incredibly motivating, and hope that you will as well.

The memo also included a message from Marita Rodrigues, the executive director of strategic partnerships at the National Center for Missing and Exploited Children.

Team Apple,

I wanted to share a note of encouragement to say that everyone at NCMEC is SO PROUD of each of you and the incredible decisions you have made in the name of prioritizing child protection.

It's been invigorating for our entire team to see (and play a small role in) what you unveiled today.

I know it's been a long day and that many of you probably haven't slept in 24 hours. We know that the days to come will be filled with the screeching voices of the minority.

Our voices will be louder.

Our commitment to lift up kids who have lived through the most unimaginable abuse and victimizations will be stronger.

During these long days and sleepless nights, I hope you take solace in knowing that because of you many thousands of sexually exploited victimized children will be rescued, and will get a chance at healing and the childhood they deserve.

Thank you for finding a path forward for child protection while preserving privacy.

Apple has yet to make a public-facing statement addressing the backlash.

Apple has long been viewed as a champion of user privacy. At WWDC 2021, the company unveiled plans to radically expand privacy features across the Apple ecosystem.



60 Comments

Beats 5 Years · 3073 comments

Some of this sounds like PR BS. I don’t see how this helps children like Tim claims.  And Apple collaborating with the government is embarrassing.

6 Likes · 0 Dislikes
tylersdad 14 Years · 310 comments

This is monumentally bad for privacy. It's making me reconsider my investments in Apple products. My entire family belongs to the Apple ecosystem. We all have some version of the iPhone 12, iPads and Apple Watches. It starts with examining personal pictures ostensibly to prevent child exploitation, but where does it lead? Where does it end? 

13 Likes · 0 Dislikes
mknelson 10 Years · 1151 comments

Beats said:
Some of this sounds like PR BS. I don’t see how this helps children like Tim claims.  And Apple collaborating with the government is embarrassing.

Collaborating with the government how? (you are the government in a democracy btw).

5 Likes · 0 Dislikes
newisneverenough 7 Years · 46 comments

* Surveillance By Apple * is wrong. I hope this is somehow stopped. Who would have thought privacy focused Apple could be so exceptionally stupid. 

6 Likes · 0 Dislikes
Wesley Hilliard 5 Years · 330 comments

tylersdad said:
It starts with examining personal pictures ostensibly to prevent child exploitation, but where does it lead? Where does it end?

Apple isn't examining personal pictures. No one is examining anything. Your photo gets a unique number based on how the pixels are laid out, and that number is compared against a database of numbers representing known images, looking for an exact match. Apple puts the odds of falsely flagging an account at one in a trillion per year. This is oversimplified, but that's the basic overview. There isn't some algorithm looking for nudity in images.


Where does it end? It already has ended. The tool exists, it took years to develop, it is rolling out this fall. There is no "next."

11 Likes · 0 Dislikes