
Apple slammed for not doing enough to prevent CSAM distribution

iMessage can warn minors about nudity in photos

Apple and Microsoft have provided details of their methods for detecting or preventing child sexual abuse material distribution, and an Australian regulator has found their efforts lacking.

The Australian e-Safety Commissioner demanded that major tech firms, including Apple, Facebook, Snapchat, and Microsoft, detail their methods for preventing child abuse and exploitation on their platforms. The demand was issued on August 30, and the companies had 29 days to comply or face fines.

Apple and Microsoft are the first companies to receive scrutiny from this review, and according to Reuters, the Australian regulator found their efforts insufficient. Neither company proactively scans user files on iCloud or OneDrive for CSAM, nor does either use algorithmic detection in FaceTime or Skype.

Commissioner Julie Inman Grant called the companies' responses "alarming." She stated that there was "clearly inadequate and inconsistent use of widely available technology to detect child abuse material and grooming."

Apple recently announced that it had abandoned its plans to scan photos being uploaded to iCloud for CSAM. That approach would also have become less effective now that users have access to fully end-to-end encrypted photo storage and backups.

Instead of scanning for existing content being stored or distributed, Apple has decided to take a different approach that will evolve over time. Currently, devices used by children can be configured by parents to alert the child if nudity is detected in photos sent or received over iMessage.
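As a rough illustration of how that warning flow fits together, here is a minimal Swift sketch of the decision logic. The types and names (SensitivityDetector, CommunicationSafetySettings, and so on) are hypothetical stand-ins, since Apple has not published the interface behind the feature; the sketch only assumes that detection runs on the device and that the warnings apply to child accounts whose parents have enabled them.

```swift
import Foundation

// Hypothetical stand-in for the on-device nudity detector behind the
// Communication Safety feature. Apple has not published this interface,
// so all of these names are illustrative assumptions, not real API.
struct SensitivityResult {
    let containsNudity: Bool
}

struct SensitivityDetector {
    // The real feature analyzes the photo entirely on device; this stub
    // returns a canned result so the example compiles and runs.
    func analyze(_ imageData: Data) -> SensitivityResult {
        SensitivityResult(containsNudity: false)
    }
}

// Parental controls as described in the article: warnings apply to
// child accounts whose parents have enabled the feature.
struct CommunicationSafetySettings {
    let isChildAccount: Bool
    let nudityWarningsEnabled: Bool
}

enum PhotoPresentation {
    case showNormally
    case blurWithWarning(String)
}

// Decide how a photo in a child's Messages thread should be presented.
func presentation(for imageData: Data,
                  settings: CommunicationSafetySettings,
                  detector: SensitivityDetector = SensitivityDetector()) -> PhotoPresentation {
    // The feature only applies when a parent has turned it on for a child account.
    guard settings.isChildAccount, settings.nudityWarningsEnabled else {
        return .showNormally
    }
    if detector.analyze(imageData).containsNudity {
        return .blurWithWarning("This photo may contain nudity.")
    }
    return .showNormally
}
```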

Apple plans to expand this capability to detect nudity in videos, then move the detection and warning system into FaceTime and other Apple apps. Eventually, the company hopes to create an API that would let developers use the detection system in their apps as well.
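Apple has not said what that developer API would look like, so the following Swift sketch is purely speculative: the protocol, types, and method names are assumptions meant to show how a third-party messaging app might adopt such a system, not a real Apple framework.

```swift
import Foundation

// Speculative shape for the developer-facing detection API the article says
// Apple hopes to offer. None of these names correspond to a shipping framework;
// they are assumptions used to show how an app might adopt such a system.
enum SensitivityPolicy {
    case disabled                   // the user (or a parent) has not enabled detection
    case warn                       // blur the media and show a warning
    case warnAndRequireConfirmation // warn and require explicit confirmation to proceed
}

struct SensitivityAnalysis {
    let isSensitive: Bool
}

protocol SensitivityAnalyzing {
    var policy: SensitivityPolicy { get }
    func analyzeImage(at url: URL) async throws -> SensitivityAnalysis
    func analyzeVideo(at url: URL) async throws -> SensitivityAnalysis
}

// How a third-party messaging app might gate an outgoing attachment:
// check the system-level policy, run the on-device analysis, and only
// warn the user when the media is flagged.
func shouldWarnBeforeSending(_ attachment: URL,
                             isVideo: Bool,
                             analyzer: some SensitivityAnalyzing) async -> Bool {
    guard analyzer.policy != .disabled else { return false }
    do {
        let analysis: SensitivityAnalysis
        if isVideo {
            analysis = try await analyzer.analyzeVideo(at: attachment)
        } else {
            analysis = try await analyzer.analyzeImage(at: attachment)
        }
        return analysis.isSensitive
    } catch {
        // If analysis fails, fall back to not warning rather than blocking the send.
        return false
    }
}
```

The design points reflected here are that analysis stays on the device and that apps defer to the user's (or parent's) system-level setting before showing any warning.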

Abandoning CSAM detection in iCloud has been celebrated by privacy advocates. At the same time, it has been condemned by child safety groups, law enforcement, and government officials.