San Francisco doctor charged with possessing child pornography in iCloud

Amid controversy surrounding Apple's CSAM detection system, a San Francisco Bay Area doctor has been charged with possessing child pornography in his Apple iCloud account, according to federal authorities.

The U.S. Department of Justice announced Thursday that Andrew Mollick, 58, had at least 2,000 sexually exploitative images and videos of children stored in his iCloud account. Mollick is an oncology specialist affiliated with several Bay Area medical facilities, as well as an associate professor at UCSF School of Medicine.

Additionally, he uploaded one of the images to social media app Kik, according to the recently unsealed federal complaint (via KRON4).

Apple has recently announced plans to introduce a system designed to detect child sexual abuse material (CSAM) in iCloud and provide a report to the National Center for Missing and Exploited Children (NCMEC). The system, which relies on cryptographic techniques to ensure user privacy, has caused controversy among the digital rights and cybersecurity communities.

The system does not scan the actual images in a user's iCloud account. Instead, it matches hashes of images being uploaded to iCloud Photos against a database of known CSAM hashes supplied by at least two child safety organizations. An account is flagged for review only after at least 30 matches, a threshold designed to mitigate false positives.
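
For readers curious how hash matching with a threshold works in principle, here is a minimal sketch in Swift. It is not Apple's implementation: the real system uses a perceptual NeuralHash and a private set intersection protocol rather than plain SHA-256 and a local set, and the function names and the empty hash database below are purely illustrative.

```swift
import Foundation
import CryptoKit

// Conceptual sketch only. Apple's system uses a perceptual "NeuralHash" and a
// private set intersection protocol, so neither the device nor Apple learns
// individual match results below the threshold. Plain SHA-256 against a local
// set is a stand-in to illustrate the hash-matching and threshold idea.

let reportingThreshold = 30          // the match threshold Apple has described
let knownHashes: Set<String> = []    // hypothetical database of known-CSAM hashes

// Hash an image's bytes into a hex string.
func digest(of imageData: Data) -> String {
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

// Count how many of the uploaded images match the known-hash database.
func matchCount(in images: [Data]) -> Int {
    images.filter { knownHashes.contains(digest(of: $0)) }.count
}

// Only accounts that cross the threshold would be escalated for human review.
func crossesThreshold(_ images: [Data]) -> Bool {
    matchCount(in: images) >= reportingThreshold
}
```

In Apple's published design the matching happens on-device inside encrypted vouchers as images are uploaded, so a plain check like this is only an analogy for the counting-and-threshold step.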

Documents revealed during the Epic Games v. Apple trial indicated that Apple anti-fraud chief Eric Friedman thought that the Cupertino tech giant's services were the "greatest platform for distributing" CSAM. Friedman attributed that fact to Apple's strong stance on user privacy.

Despite the backlash, Apple is pressing forward with its plans to debut the CSAM detection system. It maintains that the platform will still preserve the privacy of users who do not have collections of CSAM on their iCloud accounts.



44 Comments

nizzard 14 Years · 58 comments

I’m ok with this.   What is NOT ok is them backdooring Messages (to start).

DAalseth 6 Years · 3067 comments

This says to me that they then have no reason to add any additional measures. They can already detect these images in iCloud. 

baconstang 10 Years · 1160 comments

Looks like they caught this jerk without installing spyware on his phone.

zimmie 9 Years · 651 comments

DAalseth said:
This says to me that they then have no reason to add any additional measures. They can already detect these images in iCloud. 

They can ... because images in iCloud aren't encrypted today. Apple's servers have the ability to see them.

With this new plan to scan images on end users' devices as they are being uploaded to iCloud, the images themselves can be encrypted in a way that Apple can't break. Each uploaded instance of CSAM includes a partial direction on how to find the key to decrypt the images. This is known as threshold secret sharing. They aren't exactly parts of the key, but once Apple has enough (apparently 30 in this case), they can use the directions to generate their own copy of the key. (Edited to clarify that last sentence.)
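
For anyone who wants to see the threshold idea concretely, here's a generic Shamir-style sketch in Swift. This is not Apple's construction (Apple applies the idea to encrypted per-image safety vouchers and hasn't published an implementation); the prime, the 30-of-100 split, and every name below are made up for illustration.

```swift
import Foundation

// Generic Shamir-style threshold secret sharing: no single share reveals the
// key, but any `threshold` shares together reconstruct it.

let p = 2_147_483_647  // small prime modulus (2^31 - 1); real schemes use far larger fields

func modPow(_ base: Int, _ exponent: Int, _ modulus: Int) -> Int {
    var result = 1, b = base % modulus, e = exponent
    while e > 0 {
        if e & 1 == 1 { result = result * b % modulus }
        b = b * b % modulus
        e >>= 1
    }
    return result
}

// Modular inverse via Fermat's little theorem (modulus must be prime).
func modInverse(_ a: Int, _ modulus: Int) -> Int {
    modPow((a % modulus + modulus) % modulus, modulus - 2, modulus)
}

// Split `secret` into `count` shares, any `threshold` of which recover it.
func makeShares(secret: Int, threshold: Int, count: Int) -> [(x: Int, y: Int)] {
    let coefficients = [secret] + (1..<threshold).map { _ in Int.random(in: 1..<p) }
    return (1...count).map { x in
        // Evaluate the random degree-(threshold - 1) polynomial at x.
        let y = coefficients.enumerated().reduce(0) { acc, term in
            (acc + term.element * modPow(x, term.offset, p)) % p
        }
        return (x: x, y: y)
    }
}

// Recover the secret (the polynomial's value at x = 0) by Lagrange interpolation.
func recoverSecret(from shares: [(x: Int, y: Int)]) -> Int {
    shares.reduce(0) { acc, share in
        var term = share.y
        for other in shares where other.x != share.x {
            let numerator = (p - other.x) % p                      // (0 - x_j) mod p
            let denominator = ((share.x - other.x) % p + p) % p    // (x_i - x_j) mod p
            term = term * numerator % p * modInverse(denominator, p) % p
        }
        return (acc + term) % p
    }
}

let shares = makeShares(secret: 123_456_789, threshold: 30, count: 100)
print(recoverSecret(from: Array(shares.prefix(30))))   // prints 123456789
```

Below the threshold the shares carry no information about the key; once 30 are collected, the key can be reconstructed.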

Today, Apple employees can poke through your photos and share ones they find interesting (presumably policy does not allow this, but they have the capability to). With the announced system in place, they would no longer be able to at a technical level.

sflocal 16 Years · 6138 comments

baconstang said:
Looks like they caught this jerk without installing spyware on his phone.

Where did Apple say they were installing "spyware" on iPhones? They are scanning iCloud photos in their data centers, but nothing gets loaded onto the phone itself.


Why do people continue pushing false stories like this?