Humiliating images of women and children whose clothing has been digitally removed by Grok AI are being shared on Elon Musk’s X, despite the platform’s commitment to suspend the users who originated them.
After days of concern over the use of chatbots to create sexually explicit images of real women and children by stripping them to their underwear without their consent, the UK communications watchdog Ofcom said on Monday it had “urgently contacted X and xAI to understand what steps they have taken to comply with their legal duties to protect users in the UK”. Ofcom said it would assess whether an investigation was necessary based on the companies’ response.
Meanwhile, politicians and women’s rights campaigners accused the UK government of “digging in its heels” by failing to enforce a law passed six months ago making the creation of such intimate images illegal.
The trend, which went viral over the New Year period, also prompted the European Commission to say on Monday it was looking “very seriously” at complaints that Grok was being used to generate and disseminate sexually explicit images such as those of children.
Concerns began to surface after a December update to Musk’s free AI assistant, Grok, made it easier for users to post photos and ask the tool to remove the subjects’ clothes. Although the site does not allow full nudity, it does allow users to request that images be altered to show individuals in skimpy, exposed underwear and in sexually suggestive poses.
On Sunday and Monday, Grok users continued to create sexually suggestive images of minors, with images of children under the age of 10 being generated overnight. Ashley St. Clair, the mother of one of Musk’s children, complained that the AI tool had generated a bikini image of her from a photo taken when she was 14 years old.
A photo of Stranger Things actress Nell Fisher was doctored by Grok on Sunday to make her appear to be wearing a banana-print bikini. Fisher is 14 years old. Several women have expressed anger at X after learning that photographs of them had been manipulated into nude images without their consent. Some photographs of women and children altered by the AI tool appear to have semen-like substances smeared on their faces and chests.
Researchers from the Paris-based non-profit AI Forensics examined 50,000 mentions of @grok on X, and 20,000 images generated by the tool, posted over a one-week period between December 25 and January 1. At least a quarter of the mentions were requests for the tool to create images. Within those image-generation prompts, there was a high prevalence of words including “her”, “put on”, “take off”, “bikini”, and “clothes”.
The analysis found that more than half of the images were of people in “minimal attire” such as underwear or bikinis, with the majority being women who appeared to be under the age of 30. AI Forensics said a small portion of the images, around 2%, showed people who appeared to be 18 or younger, with some depicting children as young as five. The researchers said much of the content was still available online and included requests for Nazi and Islamic State propaganda.
Initially, Musk expressed amusement at the trend, posting crying-laughing emojis on Friday in response to a digitally generated image of a bikini-clad toaster. He said: “I don’t know why, but I couldn’t stop laughing at this.” Following global outrage over the harmful nature of the content, he later posted that “anyone using Grok to create illegal content will face the same consequences as those who upload illegal content”.
An earlier statement posted from Grok’s account, which appeared to be generated by artificial intelligence, said it had “identified flaws in the security measures” and was “immediately fixing them”. It was unclear whether the company was actually taking action to fix the security flaws.
Regulators have been working for years to outlaw nudification apps, with mixed success, but last month’s update to Grok brought the problem into the mainstream, making it easier to create and share semi-nude intimate images of individuals on one of the internet’s most popular platforms.
While creating undressed images of children is already illegal, the law surrounding the creation of deepfakes of adults is more complex. UK campaigners succeeded in passing legislation last June that makes it illegal both to create and to request intimate images of a person without their consent, but the relevant provisions have not yet been commenced, meaning the law is not currently enforceable. Sharing such non-consensual deepfake images, however, is already illegal.
Conservative peer Charlotte Owen, who supported the legislation, said: “The Government has repeatedly dug in its heels and refused to give a timetable for when it will bring these vital provisions into effect. We cannot afford any further delays. Survivors of this abuse deserve better. No one should have to live in fear of having their consent violated in this appalling way.”
A government spokesperson said it was committed to introducing the new offences of making, or soliciting, intimate deepfakes without consent as soon as possible, adding that sexually explicit deepfakes are “offensive and harmful… which is why we have introduced legislation to ban their non-consensual creation, ensuring perpetrators face appropriate punishment for the atrocious harm they cause.”
Conservative peer Gabby Bertin, who has campaigned for nudification technology to be regulated, said the government needed to act fast because existing legislation was “always catching up”.
Labour MP Jess Asato, who campaigns for better regulation of pornography, said: “It is up to Elon Musk to realise that this is a form of sexual harassment and take appropriate action. This is taking women’s images without their consent and undressing them to humiliate them – there is no other reason for doing it other than to humiliate.”