
Apple expands feature that blurs iMessage nudity to UK, Canada, New Zealand, and Australia



Apple is rolling out the communication safety feature that scans iMessages for nudity on devices owned by younger users in the U.K., Canada, New Zealand, and Australia months after debuting it in the U.S.

The feature, which is distinct from the controversial on-device Photos scanning function, automatically blurs potentially harmful images in incoming or outgoing messages on devices owned by children.

As first reported by The Guardian, Apple is expanding the feature to the U.K. after rolling it out in the U.S. in iOS 15.2. AppleInsider has learned that the feature is expanding to Canada, New Zealand, and Australia as well.

How the feature works depends on whether a child receives or sends an image with nudity. Received images will be blurred, and the child will be provided with safety resources from child safety groups. Nudity in photos sent by younger users will trigger a warning advising that the image should not be sent.
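The received-versus-sent behavior described above can be sketched as a simple decision function. This is hypothetical illustrative logic based solely on the article's description, not Apple's actual implementation; the function name and return strings are invented.

```python
from enum import Enum, auto

class Direction(Enum):
    RECEIVED = auto()
    SENT = auto()

def communication_safety_action(contains_nudity: bool, direction: Direction) -> str:
    """Hypothetical sketch of the per-message behavior described in the
    article; not Apple's real code."""
    if not contains_nudity:
        return "deliver"  # clean images pass through unchanged
    if direction is Direction.RECEIVED:
        # Incoming nudity is blurred and the child is shown safety resources.
        return "blur and show safety resources"
    # Outgoing nudity triggers a warning before the image is sent.
    return "warn before sending"

print(communication_safety_action(True, Direction.RECEIVED))
# -> blur and show safety resources
```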

The feature is privacy-focused and is available only on an opt-in basis. It must be enabled by parents. All detection of nudity is done on-device, meaning that potentially sensitive images never leave an iPhone.

Apple first announced the Communication Safety function alongside a suite of features meant to provide stronger safety mechanisms for children. That suite included a system that scanned Photos for child sexual abuse material (CSAM).

The CSAM scanning system included several privacy mechanisms and never looked through a user's images directly. Instead, it matched potentially abusive material against known hashes provided by child safety organizations. Despite that, Apple faced a backlash and delayed the feature until further notice.
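The hash-matching design described above can be sketched roughly as follows. This is a simplified illustration: Apple's actual system used perceptual "NeuralHash" values and a private set intersection protocol, not a plain cryptographic hash lookup like this, and the hash set here is a placeholder.

```python
import hashlib

def matches_known_material(image_bytes: bytes, known_hashes: set) -> bool:
    """Check an image against a set of known hashes.

    Illustrative only: the real system compared perceptual hashes supplied
    by child safety organizations, so near-duplicate images would match;
    a cryptographic hash like SHA-256 only matches exact byte copies.
    """
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in known_hashes

# The device never "sees" the content of the image; it only compares a
# derived fingerprint against the provided list.
known = {"not-a-real-hash"}  # hypothetical placeholder set
print(matches_known_material(b"example image bytes", known))  # -> False
```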

Apple's Communication Safety function is completely unrelated to the CSAM scanning mechanism. It first debuted in the U.S. in November as part of iOS 15.2, and its expansion to these regions suggests a wider rollout to other markets may follow.



23 Comments

dutchlord 279 comments · 7 Years

I am against this device based scanning by Apple. Its not of their business. This is called privacy Cook and you are not going to mess with it. 

crowley 10431 comments · 15 Years

dutchlord said:
I am against this device based scanning by Apple. Its not of their business. This is called privacy Cook and you are not going to mess with it. 

So don't opt in.

person 34 comments · 11 Years

crowley said:
dutchlord said:
I am against this device based scanning by Apple. Its not of their business. This is called privacy Cook and you are not going to mess with it. 
So don't opt in.

Nobody was given a choice. It’s just a precursor of all the anti privacy policies and features to come

crowley 10431 comments · 15 Years

person said:
crowley said:
dutchlord said:
I am against this device based scanning by Apple. Its not of their business. This is called privacy Cook and you are not going to mess with it. 
So don't opt in.
Nobody was given a choice. It’s just a precursor of all the anti privacy policies and features to come

What do you mean?  You have a choice, the feature is opt in, it clearly says so in the article.

The feature is privacy-focused and is available only on an opt-in basis. It must be enabled by parents. 

entropys 4316 comments · 13 Years

Opt in or out is not the same as a company creating the ability to do so. iMessage's main thing, apart from the cool blue boxes, is its privacy. This is helping to wreck it.

on the particular matter, my first thought was how many kids actually have iPhones anyway? But then I am an old fogey from The Time Before Mobiles. Every kid has phones. And privacy, once important, and protection against creation of a Big Brother in all its possible forms, no longer matters.