
New iOS 15.2 beta includes Messages feature that detects nudity sent to kids

Credit: Andrew O'Hara, AppleInsider


Apple's latest iOS 15.2 beta has introduced a previously announced opt-in communication safety feature designed to warn children — and not parents — when they send or receive photos that contain nudity.

The Messages feature was one part of a suite of child safety initiatives announced back in August. Importantly, the iMessage feature is not the controversial system designed to detect child sexual abuse material (CSAM) in iCloud.

Although not enabled by default, parents or guardians can switch on the Messages feature for child accounts on a Family Sharing plan. The feature will detect nudity in Messages, blur the image, and warn the child.

Apple says children will be given helpful resources and reassured that it's okay if they don't want to view the image. If a child attempts to send photos that contain nudity, similar protections will kick in. In either case, the child will be given the option to message someone they trust for help.

Unlike the previously planned version of the feature, parents will not be notified if the system detects a message contains nudity. Apple says this is because of concerns that a parental notification could present a risk for a child, including the threat of physical violence or abuse.

Apple says the nudity detection flag never leaves the device, and that the system doesn't encroach upon iMessage's end-to-end encryption.
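Apple hasn't published how the detection itself works, but a rough picture of a purely on-device flow can be sketched in Swift. The snippet below is a hypothetical illustration only: it assumes a bundled Core ML classifier (the NudityClassifier name is made up, not a real Apple API), runs it through the Vision framework, and blurs the image locally when a confidence threshold is crossed.

```swift
import UIKit
import Vision
import CoreImage

/// Hypothetical sketch of an on-device nudity check for an incoming image.
/// "NudityClassifier" stands in for an assumed bundled Core ML model class;
/// Apple has not published its actual Messages implementation.
func screenIncomingImage(_ image: UIImage) throws -> (flagged: Bool, display: UIImage) {
    guard let cgImage = image.cgImage else { return (false, image) }

    // Run the (assumed) classifier entirely on device via the Vision framework.
    let model = try VNCoreMLModel(for: NudityClassifier().model)
    let request = VNCoreMLRequest(model: model)
    try VNImageRequestHandler(cgImage: cgImage).perform([request])

    let nudityScore = (request.results as? [VNClassificationObservation])?
        .first { $0.identifier == "nudity" }?.confidence ?? 0

    // The "flag" stays local: it only decides whether to blur and warn.
    guard nudityScore > 0.8 else { return (false, image) }

    // Blur the preview with Core Image before it is shown behind the warning.
    let input = CIImage(cgImage: cgImage)
    let blur = CIFilter(name: "CIGaussianBlur")!
    blur.setValue(input, forKey: kCIInputImageKey)
    blur.setValue(30.0, forKey: kCIInputRadiusKey)

    guard let output = blur.outputImage,
          let blurredCG = CIContext().createCGImage(output, from: input.extent) else {
        return (true, image)
    }
    return (true, UIImage(cgImage: blurredCG))
}
```

The point of the sketch is simply that classification and blurring can both happen without any networking code, which is consistent with Apple's claim that the flag never leaves the device.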

It's important to note that the feature is opt-in and only being included in the beta version of iOS, meaning it is not currently public-facing. There's no timeline on when it could reach a final iOS update, and there's a chance that the feature could be pulled from the final release before that happens.

Again, this feature is not the controversial CSAM detection system that Apple announced and then delayed. Back in September, Apple said it would debut the CSAM detection feature later in 2021. Now, the company says it is taking additional time to collect input and make necessary improvements.

There's no indication of when the CSAM system will debut. However, Apple did say that it will be providing additional guidance to children and parents in Siri and Search. In an update to iOS 15 and other operating systems later in 2021, the company will intervene when users perform searches for queries related to child exploitation and explain that the topic is harmful.



19 Comments

JaiOh81 8 Years · 61 comments

I’m legitimately curious: how can iMessage “detect” sexually explicit photos being sent to or from a phone?

Does anyone know how this is being done on device?

elijahg 18 Years · 2842 comments

Although not enabled by default, parents or guardians can switch on the Messages feature for child accounts on a Family Sharing plan. The feature will detect nudity in Messages, blur the image, and warn the child. 

Unlike the previously planned version of the feature, parents will not be notified if the system detects a message contains nudity. Apple says this is because of concerns that a parental notification could present a risk for a child, including the threat of physical violence or abuse.

Surely that's a bit of a catch-22: Since it has to be enabled by parents but won't notify parents due to potential repercussions, I'd wager the kind of parents that would dish out those repercussions would never turn the feature on anyway?

This does still have the potential to be contentious, since it's still scanning (on device) the photos being sent. That means, much like before, the tech for further erosion of privacy is already implemented, potentially allowing a country to force Apple to scan for particular pictures as it sees fit. Yes, it's enabled for child accounts only, but it wouldn't be much trouble to enable scanning for anyone, and not much more of a stretch to force Apple to send a surreptitious notification to a government minion.

What happens if Jingping tells Apple it has to scan for any photos of Winnie the Pooh? Will Apple still say no under threat of being removed from China? This question that was never really answered before still exists - and Apple's only response is "we won't bend to government demands", with no answer to "even if it's China?".

arturo.soldatini 3 Years · 2 comments

elijahg said:

This does still have the potential to be contentious, since it's still scanning (on device) the photos being sent. That means, much like before, the tech for further erosion of privacy is already implemented, potentially allowing a country to force Apple to scan for particular pictures as it sees fit. Yes, it's enabled for child accounts only, but it wouldn't be much trouble to enable scanning for anyone, and not much more of a stretch to force Apple to send a surreptitious notification to a government minion.

What happens if Jingping tells Apple it has to scan for any photos of Winnie the Pooh? Will Apple still say no under threat of being removed from China? This question that was never really answered before still exists - and Apple's only response is "we won't bend to government demands", with no answer to "even if it's China?".

Photos library scanning is already happening on our devices. Both Apple and Google offer item and face recognition in their apps, so it makes no sense to be worried now after years of library scanning.

darkvader 15 Years · 1146 comments

elijahg said:

Although not enabled by default, parents or guardians can switch on the Messages feature for child accounts on a Family Sharing plan. The feature will detect nudity in Messages, blur the image, and warn the child. 

Unlike the previously planned version of the feature, parents will not be notified if the system detects a message contains nudity. Apple says this is because of concerns that a parental notification could present a risk for a child, including the threat of physical violence or abuse.

Surely that's a bit of a catch-22: Since it has to be enabled by parents but won't notify parents due to potential repercussions, I'd wager the kind of parents that would dish out those repercussions would never turn the feature on anyway?

This does still have the potential to be contentious, since it's still scanning (on device) the photos being sent. That means, much like before, the tech for further erosion of privacy is already implemented, potentially allowing a country to force Apple to scan for particular pictures as it sees fit. Yes, it's enabled for child accounts only, but it wouldn't be much trouble to enable scanning for anyone, and not much more of a stretch to force Apple to send a surreptitious notification to a government minion.

What happens if Jingping tells Apple it has to scan for any photos of Winnie the Pooh? Will Apple still say no under threat of being removed from China? This question that was never really answered before still exists - and Apple's only response is "we won't bend to government demands", with no answer to "even if it's China?".

If this contains no code capable of reporting the results of the image analysis to any third party, then it's fine. 

Unfortunately, we have no way of knowing if that dangerous code is there, and given Apple's previous intent of putting it there, Apple can't be trusted at this point.  Apple needs to have a third-party code audit to confirm that the dangerous code is indeed gone.

auxio 19 Years · 2766 comments

darkvader said:

If this contains no code capable of reporting the results of the image analysis to any third party, then it's fine. 

Unfortunately, we have no way of knowing if that dangerous code is there, and given Apple's previous intent of putting it there, Apple can't be trusted at this point.  Apple needs to have a third-party code audit to confirm that the dangerous code is indeed gone.

The on-device scanning and sending of information could easily be audited by logging everything your phone is sending out via your router (which I'm sure people are already doing).  As for whether information is shared with 3rd parties, that would require an audit on Apple's side, not the on-device code.