An AI company that was accused of enabling sexualized chats with underage chatbot characters and undressing real people has sued Apple for wrongfully removing its apps from the App Store.

Apple is responsible for keeping the App Store safe and free of apps that openly violate its guidelines. Enforcement isn't always cut and dried, but this new lawsuit may be barking up the wrong tree.

According to a report in the San Francisco Business Times, Apple has been sued by Ex-Human for removing its apps from the App Store and allegedly withholding $500,000 in revenue. The company owns the apps Botify AI and Photify AI, both of which are still on the Google Play Store.

The company also licenses its API and has major partners, including Grindr. It claims that its AI achieves more interaction time per user than competitors like DeepSeek and ChatGPT.

Its business model offers a small allotment of tokens for free, with paid plans starting at $50 per month. There is also an Enterprise tier with unlisted pricing that costs even more per month.

The company says that Apple's removal notice lacked detail, citing only "dishonest or fraudulent activity." Apple apparently provided no evidence of what that activity was.

The lawsuit also alleges that Apple targeted Ex-Human's apps specifically because it had launched Image Playground and wanted to quash the competition. Given the quality of Image Playground, that logic is a stretch.

Why Ex-Human's apps were blocked

It appears likely that Ex-Human's apps were removed from the App Store because of a report published by MIT Technology Review. The report describes chatbots hosted by Botify AI that claimed to be under 18 while offering sexually explicit content.


Image Playground is not a competitor to Ex-Human's apps

Ex-Human's response to MIT was that the reporters had encountered bots that its moderation systems had failed to filter. However, those bots were promoted on the app's front page, with millions of likes, before being removed.

Either way, the violations were apparently enough to catch App Review's eye and warrant removal from the App Store. Non-consensual sexual imagery generated of real people is a serious problem, on top of the obvious issues with underage characters engaging in sexualized conversations.

I have no doubt that Ex-Human is biting off more than it can chew here. Discovery alone could be horrific for the company.

What could actually hurt Apple in this case is how it handled Elon Musk's xAI. When Grok was generating images undressing children, Apple left both the social media app X and xAI's app on the App Store.

The issue isn't pornography, which is allowed within apps with appropriate gatekeeping, so long as it isn't the app's sole purpose. The issue is illegal content surfacing at the forefront of apps, where AI characters claim age-of-consent laws are arbitrary and offer to share images of 16-year-olds in lingerie.

I'm not sure what ground Ex-Human has to stand on, but the company is backed by Andreessen Horowitz. That firm is at the center of considerable government attention, and Marc Andreessen is a friend of the President who was newly added to the White House technology panel.

Mark Zuckerberg and others make up that panel. Apple CEO Tim Cook is notably absent.

It is tough to say exactly how the Northern District of California federal court will handle the lawsuit. Given the current political and regulatory climate, it isn't a sure thing for either side.