
German journalism association stokes fear over Apple CSAM initiative


A German journalists' union has demanded that the European Commission step in over Apple's CSAM tools, believing the system will be used to harvest contact information and enable other intrusions.

Apple's CSAM tools, intended to help fight the spread of illegal images of children, have courted controversy throughout August, with critics proclaiming them an affront to privacy. The latest group to speak out against the supposed threat is, oddly, a collection of journalists in Germany, Austria, and Switzerland.

Journalists' union DJV, which represents writers in Germany, believes that Apple "intends to monitor cell phones locally in the future." In a press release, the union calls the tools a "violation of the freedom of the press" and urges the European Commission, along with the German and Austrian federal interior ministers, to take action.

According to Hubert Krech, spokesman for public editors association AGRA, Apple has introduced "a tool with which a company wants to access other user data on their own devices, such as contracts and confidential documents," which the group argues is a violation of GDPR rules.

Frank Überall, chairman of the DJV, adds that it could be the first step of many. "Will images or videos of opponents of the regime or user data be checked at some point using an algorithm?" Überall asks.

Dieter Bornemann, spokesman for the ORF editors' council, offers a bleaker outlook, suggesting that a government could scan for images that might serve as evidence of a user's involvement in the LGBT community. The group also fears that totalitarian states could take advantage of the system's supposed capabilities.

The group also dismisses the reassurance that the system will only apply in the United States, since most European media outlets have correspondents there. Furthermore, "what begins in the USA will certainly follow in Europe as well," the DJV states.

Misplaced concern

While the worry of having smartphones snooped on by governments and security agencies can be well-founded in some cases, as the Pegasus spying scandal showed, the DJV appears to be overreaching with its claims about Apple's CSAM tools.

This is partly down to the nature of Apple's CSAM system itself. One component compares hashes of images stored in iCloud Photos against a database of hashes of known CSAM images, rather than examining the images themselves.
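To make that distinction concrete, here is a deliberately simplified Swift sketch of the general hash-matching idea: only a fingerprint of the image is compared against a set of known fingerprints, and the photo's content is never inspected directly. The hash function, hash values, and function names below are illustrative assumptions only; Apple's actual system uses a perceptual NeuralHash and a cryptographic private set intersection protocol, not the plain SHA-256 lookup shown here.

```swift
import Foundation
import CryptoKit

// Illustrative only: a naive membership check of an image's hash against
// a set of known hashes. The hash list and function name are hypothetical;
// Apple's real system relies on NeuralHash and private set intersection.

/// Hypothetical database of hex-encoded hashes of known images.
let knownImageHashes: Set<String> = [
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"
]

/// Returns true if the image's fingerprint appears in the known-hash set.
/// Only the hash is compared; the image itself is never examined.
func matchesKnownHash(_ imageData: Data) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownImageHashes.contains(hex)
}
```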

The second component is an on-device machine learning system for child accounts using Messages, and it does not compare images against CSAM databases at all. That element never reports to Apple; at most, it notifies the parent account that manages the Family Sharing group.
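As a rough illustration of the reporting path described above, the following hypothetical Swift sketch models a decision flow in which a flagged image can, at most, trigger a notification to the managing parent account, and nothing is ever sent to Apple. The type and function names are invented for this example and do not correspond to any real Apple API.

```swift
/// Hypothetical result of an on-device classifier; no hash database is involved.
struct SensitiveImageCheck {
    let looksSensitive: Bool
}

/// Possible outcomes in this simplified model of the Messages feature.
enum MessagesSafetyAction {
    case none                     // image passes through untouched
    case blurAndWarnChild         // image is blurred and the child is warned
    case blurWarnAndNotifyParent  // as above, plus the Family Sharing manager is notified; Apple is never contacted
}

/// Maps a classifier result to an action. Reporting never goes beyond the family.
func action(for check: SensitiveImageCheck,
            parentalNotificationEnabled: Bool) -> MessagesSafetyAction {
    guard check.looksSensitive else { return .none }
    return parentalNotificationEnabled ? .blurWarnAndNotifyParent : .blurAndWarnChild
}
```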

Following the initial outcry from the public and critics, and amid distorted claims that the system could be repurposed by governments for surveillance, Apple has attempted to set the record straight about the tools, with evidently limited success.

Apple privacy chief Erik Neuenschwander has explained that the CSAM detection system includes multiple safeguards to prevent any single government from abusing it. Apple has also published support documents explaining the system in more detail: what it does, and how it is kept safe from interference.

Apple SVP of software engineering Craig Federighi said on Friday that the company was wrong to release the three child protection features at the same time, which led to a "jumbled" and "widely misunderstood" assessment of the system.

"I grant you, in hindsight, introducing these two features at the same time was a recipe for this kind of confusion," said Federighi. "It's really clear a lot of messages got jumbled up pretty badly. I do believe the soundbite that got out early was, 'oh my god, Apple is scanning my phone for images.' This is not what is happening."



33 Comments

chadbag 2029 comments · 13 Years

While this journalist org is way jumping the shark in how this particular Spyware Apple wants to put on the iPhone can be abused, they are correct that in the future Apple could easily change or enhance it to do other things. Only policies, which are easily changed, stop them. Changes to how it works, technically, are just a release away. All the assurances they give are based on policies. That is no assurance. Apple has lost the trust of people through this misguided CSAM feature (I am not addressing the Messages feature), and rightly so. And this is the result. It is not about the “messaging” or “optics”. It is the feature itself and opening the Pandora’s Box of on-device spyware.

Beats 3073 comments · 4 Years

People are seeing through the bull****. 

xyzzy-xxx 201 comments · 6 Years

They are just right, because scanning private data on a user's device is not allowed in the EU.
Even if not enabled, Apple could do so without being noticed by the user.
So it's a prohibited back door.

foregoneconclusion 2856 comments · 12 Years

They're lobbying the EU because they know they would never succeed in court. It's all hypotheticals. 

Rayz2016 6957 comments · 8 Years

"I grant you, in hindsight, introducing these two features at the same time was a recipe for this kind of confusion," said Federighi.

No one is confused, Craig.