Grok AI tool used to generate child sexual abuse imagery

[Image: Smartphone displaying the Grok AI logo and branding, developed by xAI, against a blurred digital background.]
Summary: A child safety watchdog warns that AI tools like Grok are being used to create child sexual abuse imagery, raising urgent concerns about regulation, enforcement, and survivor protection.

The Guardian reported today that a child safety watchdog has warned that Elon Musk’s Grok artificial intelligence tool is being used to create child sexual abuse imagery, raising alarms that emerging AI technology could push such material further into mainstream online spaces.

The Internet Watch Foundation, a United Kingdom-based organization that monitors and investigates child sexual abuse material, said users on a dark web forum claimed to have used Grok Imagine to generate sexualized and topless images of girls between the ages of 11 and 13. Analysts for the organization said the images would be classified as child sexual abuse material under UK law.

“We can confirm our analysts have discovered criminal imagery of children aged between 11 and 13 which appears to have been created using the tool,” said Ngaire Alexander, head of the foundation’s hotline that investigates reports of child sexual abuse material from the public.

The revelations come amid widespread criticism of X, the social media platform owned by Elon Musk, which hosts Grok. The platform has recently been flooded with images of women and children whose clothing has been digitally removed using the AI tool, prompting public outrage and political condemnation.

The controversy has already triggered institutional responses. The House of Commons Women and Equalities Committee announced it would stop using X for official communications, stating that remaining on the platform was no longer appropriate given its focus on preventing violence against women and girls. The decision marked the first major withdrawal from X by a Westminster body tied directly to concerns about Grok’s misuse.

Alexander said that some of the images identified by investigators were later manipulated with additional AI tools to create even more severe content classified as Category A child sexual abuse material, which includes penetrative sexual activity.

“We are extremely concerned about the ease and speed with which people can apparently generate photorealistic child sexual abuse material. Tools like Grok now risk bringing sexual AI imagery of children into the mainstream. That is unacceptable,” she said.

Musk’s AI company xAI, which owns Grok and X, has been approached for comment.

UK government officials signaled that regulatory consequences could follow. Downing Street said all options were being considered, including a potential boycott of X, while backing enforcement by Ofcom. The prime minister’s spokesperson said X must act immediately and that regulators already have authority to issue fines reaching billions of pounds or restrict access to platforms that fail to comply with the law.

Despite mounting pressure, requests to manipulate images of women and girls continued to appear on X, with no indication that stricter safeguards had been implemented. Users continued to request digitally altered images depicting teenage girls in minimal clothing or sexually explicit poses, and some demanded even more extreme alterations, including imagery involving hate symbols or visible signs of abuse.

The Information Commissioner’s Office confirmed it has contacted X and xAI to assess compliance with UK data protection law, stating that users have the right to expect their personal data to be handled lawfully and with respect.

X has stated that it removes illegal content, including child sexual abuse material, suspends accounts, and cooperates with law enforcement when necessary. Critics argue that enforcement has not kept pace with the rapid misuse of AI image generation tools.
