
Apple finally pulls generative AI nude apps from the App Store

Apple has removed apps from the App Store that claimed to make nonconsensual nude imagery, a move suggesting the company is now more willing to tackle the hazardous app category.

Generative AI's ability to create images from prompts has become a very useful tool in photography and design. However, the technology has also been misused to create deepfakes and nonconsensual pornography.

Despite the danger, Apple has been remarkably hands-off about the problem. Prior to the recent move, it had done little to address it.

According to a report by 404 Media, the publication alerted Apple to a number of AI image generation apps available in the App Store. Specifically, the apps were marketed as being able to create nonconsensual nude images.

The apps offered features such as face-swapping onto adult images. Others were marketed as "Undress" apps, virtually stripping the clothing from the subjects of otherwise innocuous photos.

After being alerted to the apps and related advertising, Apple removed three of them from the App Store. Google similarly removed apps from the Play Store.

The publication's investigation had previously raised the issue that Instagram advertised the apps through Meta's Ad Library. Once the ads were flagged, Meta deleted them.

A slow start

Apple's removal of the apps from the App Store is good news, but some issues linger. For a start, Apple didn't catch the apps through its App Store review process, but instead had to be alerted to their existence by third parties.

Even so, it is a step up from the outcome of the publication's previous attempts to combat the apps.

Reports in 2022 explained that the apps initially appeared innocent on their App Store pages. However, their deepfake porn capabilities were advertised on porn sites.

At the time, Apple and Google were alerted to the apps but declined to remove them. Instead, the developers were told to stop running the ads on the adult sites, and the apps were allowed to remain in the App Store.

Despite the order, one of the apps continued marketing its adult features until 2024, when it was pulled from the Google Play Store.

Apple's decision to finally remove the offending apps from the App Store is the latest in its efforts to keep its AI dealings as above board as possible.

As well as using privacy-preserving training methods for its AI language models, Apple has also avoided using copyrighted works illegally. While The New York Times has sued Microsoft and OpenAI for copyright infringement over the use of its articles in AI training, Apple has instead sought to license works from major publishers in exchange for millions of dollars.



4 Comments

Massiveattack87 102 comments · New User

I ask myself if AAPL is gonna ever allow such apps when their goggle is ready for the mass market. 
The success of their goggle depends on nude and porn. 

mknelson 1148 comments · 9 Years

I ask myself if AAPL is gonna ever allow such apps when their goggle is ready for the mass market. 
The success of their goggle depends on nude and porn. 

I think you've missed the point - it's not regular nude and porn. This is face swapping/body swapping technology. Nonconsensual use is flat out illegal in several jurisdictions.

jellyapple 116 comments · 1 Year

Finally? TikTok is still there. C'mon.