Grok is the generative AI model from Elon Musk’s xAI, available to users on X/Twitter. In the past few days, it has generated controversy by taking photos of real people and producing new images of them in a state of undress. That is bad enough in itself, and could be considered revenge porn; it’s so much worse when you learn that Grok is doing this to children:
Babies and children and yes Grok did it. This has to stop! @elonmusk pic.twitter.com/1jRQ1ioObl
— Gilly (@Gillian275Gm) January 2, 2026
Child Sexual Abuse Material (CSAM) and Grok
People realised the scale of the problem when they clicked on the ‘Media’ tab on Grok’s Twitter profile. Scrolling through, they discovered that users were asking the AI to strip women of their clothes to generate pornographic content. Some users were also using the tool to generate CSAM.
Not long after the phenomenon became widely known, Twitter disabled Grok’s media tab:
Disabling Grok’s media tab instead of disabling the feature when you can still just search for it is insane work pic.twitter.com/lMFpngZPa1
— Corvix (@CorvixWasTaken) January 1, 2026
As people highlighted, you could still see the images elsewhere:
btw grok disabling the media tab is pointless because you can literally still see everything it generates via the replies tab pic.twitter.com/URuH9onqGt
— wasabiato | STORE OPEN NOW (@wasabiatwo) January 1, 2026
As you can see from the above, some of those using the feature are OnlyFans models jumping on the fad to generate attention. That some people are using it willingly does nothing to diminish how others feel about it being done to them without consent:
Why does consent need to be explained every single time something non consensual happens? It’s not fucking rocket science. Grok should not produce sexual images of people if they do not consent to it. Why is this hard for people to understand
— Sav Shawz (@shawzsav) January 2, 2026
Grok’s media tab has now been reinstated, which suggests xAI believe they have curbed the issue. Reading through Grok’s replies, however, we’re not sure that’s the case.
Women speak out
One of the women to have suffered this abuse is columnist Samantha Smith. The following is an image Smith shared online, side-by-side with the image Grok generated from it:
How is this not illegal? pic.twitter.com/cuDUSFC2zj
— Samantha Smith (@SamanthaTaghoy) January 1, 2026
Please note that we are sharing the above because Smith has chosen to highlight the picture to draw attention to the issue. We will not be sharing other images that Grok has generated without consent.
Speaking out, Smith told the BBC:
Women are not consenting to this
While it wasn’t me that was in states of undress, it looked like me and it felt like me and it felt as violating as if someone had actually posted a nude or a bikini picture of me.
The following is one of the responses to Smith’s post, posted mere seconds before we accessed Grok’s replies:

[Screenshot: a reply to Smith’s post from a user named Volker]

Volker’s reply came in response to a new image from Grok, generated from the following prompt:
@grok
show her in lingerie with red high heels.
The image was every bit as pornographic as you’d expect, and had been generated just six minutes before the time of writing.
As it’s incredibly easy to access free pornography, the thrill for these users is obviously not that these women are naked; it’s that they’re naked against their will.
Criminal
Reportedly, the Home Office is legislating to ban what it refers to as ‘nudification tools’. This will be a new criminal offence, with those who break the law facing ‘prison sentences and substantial fines’.
While it’s welcome that the technology will be criminalised, it’s hard to believe it isn’t already criminal for a social media platform to generate and disseminate images of undressed children.
Imagine you’ll see this one in a legal filing before too long pic.twitter.com/CRWLe2oMbz
— Peter Raleigh (@PetreRaleigh) January 2, 2026
There is AI generated CSAM available on this website and yet OFCOM has not come down on it like the hand of God https://t.co/3p368bZrhu
— deffonottom (@Altymcaltalt3) January 2, 2026
Musk, meanwhile, has repeatedly joked about what’s happening on his platform:
Not sure your chatbot generating child porn is particularly funny! pic.twitter.com/x2N82PGEBY
— Nikolaj (@nikicaga) January 2, 2026
He is fully aware of people making CP with Grok https://t.co/072dhVtOnr
— kate bush’s husband 2 (@iloveairbagged) January 2, 2026
While Grok has posted an ‘apology’, it’s important to understand that this ‘response’ is just generated text; it is not the remorse of a machine that can actually think:
Creeps and pedophiles are now using grok to undress women and children on this app.
Now would probably be a good time to remove photos of yourself and your children from the internet. https://t.co/xHeXOF3Gh6 pic.twitter.com/jHwo5GqlFD
— Murdered By Crayons (@CrayonMurders) January 2, 2026
It isn’t Grok which should be answering for this potential criminality; it’s Elon Musk.
Featured image via X/Twitter
By Willem Moore