Apple rolling out nudity-blurring child safety feature to more countries

Apple is expanding the availability of its iOS Communications Safety feature, with the child-protection tool rolling out to six more countries.

Communications Safety is a part of iMessage that examines inbound and outbound messages for nudity on devices owned by children. After an initial rollout in the U.S. in iOS 15.2, and an expansion in 2022, the feature's now coming to another six countries.

Along with the United States, Communications Safety is already available in the U.K., Canada, New Zealand, and Australia.

According to iCulture, Communications Safety will be heading out to the Netherlands, Belgium, Sweden, Japan, South Korea, and Brazil. In the case of the Netherlands and Belgium, it will be rolling out in the coming weeks.

Originally introduced alongside the controversial and since-abandoned on-device Photos scanning feature, Communications Safety attempts to detect whether a child is sending or receiving images that may contain nudity. Received images are blurred, with the young user offered links to safety resources from child safety groups.

If an image containing nudity is about to be sent, the same system will offer another warning, advising that the image should not be sent at all.

The feature is designed with privacy in mind: it is opt-in, and all detection is handled on-device, so image data never leaves the iPhone.
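For context on that on-device approach: Communications Safety itself runs inside Messages and is not something third-party apps call directly, but Apple later exposed a similar on-device nudity check to developers through the SensitiveContentAnalysis framework in iOS 17. The sketch below is purely illustrative of the opt-in, local-only pattern described above, assuming that framework and its associated entitlement; it is not the Communications Safety implementation itself.

import SensitiveContentAnalysis

// Illustrative sketch (iOS 17+): check a received image for nudity entirely
// on-device using SensitiveContentAnalysis, and decide whether to blur it.
// Assumes the app has the sensitive content analysis entitlement and that the
// user has enabled the feature in Settings.
func shouldBlur(imageAt url: URL) async -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // Respect the opt-in: if the user hasn't enabled the feature, do nothing.
    guard analyzer.analysisPolicy != .disabled else { return false }

    do {
        // Analysis runs locally; the image is never uploaded anywhere.
        let analysis = try await analyzer.analyzeImage(at: url)
        return analysis.isSensitive
    } catch {
        // If analysis fails, fall back to showing the image unblurred.
        return false
    }
}

The point mirrored here is that both the opt-in check and the classification happen on the device itself, which is how Apple keeps image data from leaving the iPhone.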

The expansion of the feature arrives two weeks after the 20th annual Safer Internet Day, a European initiative that saw Apple promote privacy features and free educational resources on how to stay safe online.



9 Comments

oldenboom 8 Years · 34 comments

I'm from the Netherlands. I don't really care about my child seeing any nudity (she'll immediately yell "Yuck!" anyway). However, I do care about child abuse, blackmail between kids, bullying and pornography (especially abusive, aggressive or disrespectful pornography). I do have my doubts regarding American companies trying to enforce American puritan morals upon my child. I hate to see that my child already feels nudity is bad, just because of the prevailing puritan American morals in a lot of series and movies, while at the same time aggression and gore is apparently quite accepted in the US when it's not on this side of the Atlantic. So Apple now decides she cannot see nudity but she can see violence? Yes, I would prefer my kid not to send any nude pictures and to stay away from Omegle etc. But that's far more my task as a parent, not Apple's.

Anyway, I just read the iCulture article. It seems that upon receipt of a suspected nude picture, Apple will just blur the photo and warn the child with "This photo could be sensitive. Are you sure you want to view it?". And upon sending a possible nude picture, "It's your choice but make sure you feel safe". Now, teens are by definition very curious and my kid is no exception: she'd definitely want to view such a picture but will learn pretty fast not to view any more such pictures from certain people in a chat group. So, it's not as bad as I initially thought it was. It might even be quite helpful, but still, the message it bears is "nudity is bad" and that's something I definitely don't want my child to learn.

appleinsideruser 5 Years · 663 comments

oldenboom said:
I'm from the Netherlands. I don't really care about my child seeing any nudity (she'll immediately yell "Yuck!" anyway). However, I do care about child abuse, blackmail between kids, bullying and pornography (especially abusive, aggressive or disrespectful pornography). I do have my doubts regarding American companies trying to enforce American puritan morals upon my child. I hate to see that my child already feels nudity is bad, just because of the prevailing puritan American morals in a lot of series and movies, while at the same time aggression and gore is apparently quite accepted in the US when it's not on this side of the Atlantic. So Apple now decides she cannot see nudity but she can see violence? Yes, I would prefer my kid not to send any nude pictures and to stay away from Omegle etc. But that's far more my task as a parent, not Apple's.

Anyway, I just read the iCulture article. It seems that upon receipt of a suspected nude picture, Apple will just blur the photo and warn the child with "This photo could be sensitive. Are you sure you want to view it?". And upon sending a possible nude picture, "It's your choice but make sure you feel safe". Now, teens are by definition very curious and my kid is no exception: she'd definitely want to view such a picture but will learn pretty fast not to view any more such pictures from certain people in a chat group. So, it's not as bad as I initially thought it was. It might even be quite helpful, but still, the message it bears is "nudity is bad" and that's something I definitely don't want my child to learn.

Yup, I agree. Americans think it’s fine to show people being violently brutalised. But a nipple is a step too far. Seems odd on this side of the pond!

Rogue01 3 Years · 196 comments

This goes back to when teens would watch scrambled Cinemax or HBO late at night hoping to catch a glimpse of a boob! Parents should be parents and monitor their children's activities, rather than Apple playing policeman or second parent. As Oldenboom said, in some countries nudity is far more normal than in the US, and we weren't born dressed. Of course child porn, extortion, and bullying are horrible, but it seems Apple wants to block everything that it deems inappropriate. What if it is an image of famous artwork that contains nudity? Apple will block it, claiming it might be unsafe to view, when it is a piece of art.

mjtomlin 20 Years · 2690 comments

Rogue01 said:
This goes back to when teens would watch scrambled Cinemax or HBO late at night hoping to catch a glimpse of a boob! Parents should be parents and monitor their children's activities, rather than Apple playing policeman or second parent. As Oldenboom said, in some countries nudity is far more normal than in the US, and we weren't born dressed. Of course child porn, extortion, and bullying are horrible, but it seems Apple wants to block everything that it deems inappropriate. What if it is an image of famous artwork that contains nudity? Apple will block it, claiming it might be unsafe to view, when it is a piece of art.

This is a feature that parents can turn on in iMessage if they feel the need to. This is not Apple imposing their morals on you.

This is about a child having an unsupervised conversation in iMessage on their iPad or iPhone with someone who is sending them nude photos. Chances are those aren't photos of art. While you may not care that a predator is grooming your child, most parents would have a problem with it.

Anilu_777 8 Years · 579 comments

Of course the problem that isn’t addressed is sexual abuse within the home. In those cases nudity is the “norm” for the wrong reasons.