Experts warn that the use of AI to harm women has just begun


“Since discovering Grok AI, regular porn no longer works for me, it feels absurd now,” one enthusiast of the AI chatbot owned by Elon Musk wrote on Reddit. Another agreed: “If I really want a specific person, yes.”

If those who have been horrified by the distribution of sexually explicit images on Grok are hoping that last week’s delayed security measures might put the genie back in the bottle, there are a number of posts on Reddit and elsewhere that tell a different story.

And while Grok has undoubtedly changed public understanding of the power of artificial intelligence, it has also pointed to a much broader problem: the growing availability of these tools, and of channels for distributing their output, which regulators around the world regard as all but impossible to contain. Even though the UK has announced that creating sexual and intimate images without consent will soon be a criminal offence, experts say the use of AI to harm women has only just begun.

Other AI tools have much stricter safeguards in place. When asked to strip a photo of a woman down to a bikini, the large language model (LLM) Claude says: “I can’t do that. I’m not able to edit images to change clothes or create manipulated photos of people.” ChatGPT and Google’s AI tool Gemini will create bikini images, but nothing more explicit than that.

However, there are very few limitations elsewhere. Users of the Grok forum on Reddit share tips on how to generate the most hardcore pornographic images from photos of real women. On one thread, users complained that Grok would generate topless images of women only “after a struggle”, and refused to generate genitalia. Others have observed that requesting “artistic nudity” bypasses the safeguards against depicting women fully naked.

Grok has also been used to create deepfake images of Elon Musk in a bikini. Photograph: Leon Neal/Getty Images

Beyond LLMs and major platforms there is an entire ecosystem of websites, forums, and apps dedicated to the nudification and humiliation of women. These communities are increasingly finding pipelines into the mainstream, said Anne Crennan, a researcher at the Institute for Strategic Dialogue (ISD) working on technology-facilitated, gender-based violence.

Communities on Reddit and Telegram discuss how to bypass the guardrails to produce pornography, a process known as “jailbreaking”. Threads on X spread information about nudification apps, which create AI-generated images of women with their clothes removed.

Crennan said that access to the wider Internet has widened the reach of misogynistic material, adding: “There is very fertile ground there for misogyny to flourish.”

Research from ISD last summer found dozens of nudification apps and websites, which collectively received nearly 21 million visitors in May 2025. There were 290,000 mentions of these tools on X in June and July last year. Research by the American Sunlight Project in September found thousands of ads for such apps on Meta, despite the platform’s efforts to crack down on them.

“There are hundreds of apps hosted on mainstream app stores like Apple and Google that make this possible,” said Nina Jankowicz, co-founder of the American Sunlight Project and disinformation expert. “Much of the infrastructure for deepfake sexual exploitation is supported by the companies we all use on a daily basis.”

Claire McGlynn, a law professor at Durham University and an expert on violence against women and girls, said she feared things would get worse. “OpenAI announced last November that it was going to allow ‘erotica’ in ChatGPT. What happened on X shows that any new technology is used to abuse and harass women and girls. What is it that we are going to see on ChatGPT?

“Women and girls are far more reluctant to use AI. This should not be a surprise to any of us. Women do not see it as exciting new technology, but rather as new ways to harass and abuse us and push us offline.”

Jess Asato, Labour MP for Lowestoft, has been campaigning on the issue and said her critics were still gleefully creating and sharing explicit images of her – even since the Grok restrictions. “This is still happening to me and being posted on X because I speak out about it,” she said.

Asato said the abuse of women through AI deepfakes has been happening for years, and is not limited to Grok. “I don’t know why (action) took so long. I’ve spoken to many, many victims of far worse.”

While the public Grok feed now carries restrictions, users are still able to create sexually explicit imagery based on fully clothed pictures of real people, with no restrictions for X’s free users. Asked to put the subject of a photo in bondage gear, it complies. It will also place women in sexually compromising positions and spray white, semen-like substances on them.

The point of creating deepfake nudes is often not about sharing erotic images, Crennan said, but about the spectacle of it — especially when platforms like X are flooded with images.

“It’s really a back-and-forth, (trying) to shut someone down by saying, ‘Grok, put her in a bikini,’” she said.

“Its exposure there is really important, and really reflects the misogynistic tone of trying to punish or silence women. It also has broader implications for democratic norms and the role of women in society.”
