Following worldwide pressure, including bans in certain countries, Elon Musk's X has announced that it will no longer allow Grok to create child porn and deepfake nudes. There is a catch.

Users of Musk's AI tool Grok have been taking real-life photographs of women and children and having it generate pornographic images. Musk originally described the results as "way funnier," then later insisted that it simply wasn't happening, after he had made it a paid premium feature.

When we looked into it, it took less than 10 seconds to find illegal Grok-generated content on X. After Musk monetized the illegal porn generation, it took about 30 seconds, so the paywall was hardly an effective countermeasure.

You had to be Elon Musk not to see this as a problem, or possibly Apple and Google, which notably failed to stop this breach of their app store conditions. Indonesia and Malaysia blocked Grok, the UK considered doing the same, and the US questioned Apple and Google.

Now, in an announcement on X, Musk's company says that it has "implemented technological measures" that block this kind of image editing. It further confirms that this applies equally to users with paid accounts.

The only allusion to the pressure the company has been under comes in one small caveat:

We now geoblock the ability of all users to generate images of real people in bikinis, underwear, and similar attire via the Grok account and in Grok in X in those jurisdictions where it's illegal.

In other words, it's of course doing this solely because it has to, and solely where it has to. The full announcement runs to around 300 words that make X sound as if it has — and always has had — "zero tolerance for any forms of child sexual exploitation, non-consensual nudity, and unwanted sexual content."

So X and Musk have gone from mocking the issue to profiting from it, and then from denying it all to claiming the moral high ground. It's a sickening story, and we can only hope this ends it.