
Child safety watchdog accuses Apple of hiding real CSAM figures

Apple cancelled its major CSAM proposals but introduced features such as automatic blocking of nudity sent to children

A child protection organization says it has found more cases of abuse images on Apple platforms in the UK than Apple has reported globally.

In 2022, Apple abandoned its plans for Child Sexual Abuse Material (CSAM) detection, following criticism that the technology could ultimately be used to surveil all users. The company switched instead to a set of features it calls Communication Safety, which, among other things, blurs nude photos sent to children.

According to The Guardian newspaper, the UK's National Society for the Prevention of Cruelty to Children (NSPCC) says Apple is vastly undercounting incidents of CSAM in services such as iCloud, FaceTime and iMessage. All US technology firms are required to report detected cases of CSAM to the National Center for Missing & Exploited Children (NCMEC), and in 2023, Apple made 267 reports.

Those 267 reports were supposed to cover Apple's CSAM detection worldwide. Yet the NSPCC independently found that Apple was implicated in 337 offenses in England and Wales alone between April 2022 and March 2023.

"There is a concerning discrepancy between the number of UK child abuse image crimes taking place on Apple's services and the almost negligible number of global reports of abuse content they make to authorities," Richard Collard, head of child safety online policy at the NSPCC said. "Apple is clearly behind many of their peers in tackling child sexual abuse when all tech firms should be investing in safety and preparing for the roll out of the Online Safety Act in the UK."

By comparison, Google reported 1,470,958 cases in 2023. For the same period, Meta reported 17,838,422 cases on Facebook and 11,430,007 on Instagram.

Apple is unable to see the contents of users' iMessages, as the service is end-to-end encrypted. But the NCMEC notes that Meta's WhatsApp is also encrypted, yet Meta still reported 1,389,618 suspected CSAM cases on that platform in 2023.

In response to the allegations, Apple reportedly referred The Guardian only to its previous statements about overall user privacy.

Some child abuse experts are also reportedly concerned about the potential for AI-generated CSAM. Apple has said that its forthcoming Apple Intelligence will not create photorealistic images.



12 Comments

timpetus 7 Years · 58 comments

More pressure to take the privacy poison pill. Stand firm, Apple.

y2an 15 Years · 231 comments

This is chalk and cheese. Facebook and Google both have social media platforms hosting user content for consumption by others. Apple does not. Where is the recognition of that rather significant difference in the “findings”?

hammeroftruth 16 Years · 1356 comments

y2an said:
This is chalk and cheese. Facebook and Google both have social media platforms hosting user content for consumption by others. Apple does not. Where is the recognition of that rather significant difference in the “findings”?

More importantly, what has been done about the findings? Were there investigations, and what were the results? How many of those numbers were actual child abuse images, and how many resulted in action from law enforcement? What does NCMEC actually do, and what has it done, about this issue in general?

9secondkox2 8 Years · 3148 comments

There are bad people out there. 

That is not an excuse to go full draconian and spy on the law abiding good people. 

No reporting system is perfect. There are likely more cases than reported, just as with other evil things.

But don't try to turn that into a movement to treat the good guys like criminals, losing our freedoms, privacy, and dignity in the process.

gatorguy 13 Years · 24627 comments

y2an said:
This is chalk and cheese. Facebook and Google both have social media platforms hosting user content for consumption by others. Apple does not. Where is the recognition of that rather significant difference in the “findings”?

Google does have YouTube, where user content can be shared, but direct interactions with specific people are not a strong point. I would not call it a "social media platform" in the vein of a Facebook or Snapchat. Not that they haven't tried. :/

Of note, PatentlyApple has an expanded viewpoint. 
https://www.patentlyapple.com/2023/06/apple-was-once-a-leader-in-scanning-message-apps-for-child-pornographic-images-now-joins-a-group-to-protect-encryption-ov.html