Apple is rolling out its communication safety feature, which scans iMessages for nudity on devices owned by younger users, in the U.K., Canada, New Zealand, and Australia, months after debuting it in the U.S.
The feature, which is distinct from the controversial on-device Photos evaluation function, automatically blurs potentially harmful images in incoming or outgoing messages on devices owned by children.
First reported by The Guardian, Apple is expanding the feature to the U.K. after rolling it out in the U.S. back in iOS 15.2. AppleInsider has learned that the feature is expanding to Canada, New Zealand, and Australia as well.
How the feature works depends on whether a child receives or sends an image with nudity. Received images will be blurred, and the child will be provided with safety resources from child safety groups. Nudity in photos sent by younger users will trigger a warning advising that the image should not be sent.
The feature is privacy-focused and strictly opt-in, requiring parents to enable it. All nudity detection is performed on-device, meaning potentially sensitive images never leave the iPhone.
Apple first announced the Communication Safety function alongside a suite of features meant to improve safety for children. That suite included a system that scanned Photos for child sexual abuse material (CSAM).
The CSAM scanning system included several privacy mechanisms and never looked through a user's images directly. Instead, it matched potentially abusive material against known hashes provided by child safety organizations. Despite that, Apple faced a backlash and delayed the feature until further notice.
Apple's Communication Safety function is completely unrelated to the CSAM scanning mechanism. It first debuted in the U.S. in November as part of iOS 15.2, and its expansion to these regions suggests a broader rollout to other markets is likely.