Payment processors were against CSAM until Grok started making it

For many years, credit card companies and other payment providers were aggressive about policing child sexual abuse material. Then, Elon Musk’s Grok started undressing minors on X.

The Center for Countering Digital Hate found 101 sexually explicit images of children in its sample of 20,000 images created by Grok between December 29th and January 8th. Extrapolating from that sample, the group estimated that 23,000 sexually explicit images of children were produced in that time frame. Over that 11-day period, that works out to a sexually explicit image of a child created, on average, every 41 seconds. Not all of the sexually explicit images produced by Grok appear to be illegal, but reports indicate that at least some may cross the line.

There is tremendous confusion about what is true of Grok at any given time. Grok has offered responses with misleading details, claiming at one point, for example, that image creation had been restricted to paying X customers while free users still had direct access. Although Musk has claimed that new guardrails will prevent Grok from taking off people’s clothes, our testing showed that this isn’t necessarily true. Using a free Grok account after the new rules reportedly took effect, The Verge was able to generate deepfake images of real people in scantily clad, sexually suggestive situations. At the time of this writing, some obvious prompts appear to be blocked, but people are remarkably clever at getting around rules-based restrictions.

In the past, payment providers have been aggressive about cutting off access to websites with a significant presence of CSAM

However, you can still purchase an X subscription with your credit card through Stripe, or through the Apple and Google app stores. Musk has also suggested through his posts that he doesn’t think taking off people’s clothes is a problem. This isn’t how payment providers have historically behaved.

In the past, payment providers have been aggressive about cutting off access to websites thought to have a significant presence of CSAM, or even of legal, consensually produced sexual content. In 2020, Mastercard and Visa cut off Pornhub after a New York Times article detailed the prevalence of CSAM on the platform. In May 2025, Civitai was dropped by its credit card processor because “they don’t want to support platforms that allow AI-generated explicit content,” Civitai CEO Justin Maier told 404 Media. In July 2025, payment processors pressured Valve to remove adult games.

In fact, there have been times when financial institutions have cut off people and platforms seemingly because they did not want the reputational risk. In 2014, adult performer Eden Alexander’s fundraiser for a hospital stay was shut down by payments company WePay over a retweet. Also in 2014, JPMorgan Chase abruptly closed the bank accounts of many porn stars. In 2021, OnlyFans briefly tried to ban sexually explicit content because banks didn’t like it. (The move sparked such widespread backlash that OnlyFans soon reversed course.) All of this was legal, consensual sexual content, and it was still deemed too hot to handle.

“The industry is no longer willing to self-regulate even on the thing that is universally agreed to be the most heinous.”

But Musk’s boutique revenge porn and CSAM generator is, apparently, just fine.

This is a surprising reversal. “The industry is no longer willing to self-regulate even on the thing that is universally agreed to be the most heinous,” meaning CSAM, says Lana Swartz, author of New Money: How Payments Became Social Media, of the inaction by Stripe and the credit card companies.

Visa, Mastercard, American Express, Stripe, and Discover did not respond to requests for comment. The Financial Coalition Against Child Sexual Exploitation, an industry group made up of payment processors, banks, and credit card companies, also did not return a request for comment. On its website, the coalition claims that “as a result of its efforts, the use of credit cards to purchase child sexual exploitation material online has been virtually eliminated globally.”

Sexualized images of children are not the only problem with Grok’s image generation

“In the past, people doing completely legal work have been cut off from banks,” says Riana Pfefferkorn, a policy fellow at the Stanford Institute for Human-Centered Artificial Intelligence. There are incentives to heavily enforce limits around questionable images, and traditionally, that’s what the financial industry has done. So why is X different? It’s run by Elon Musk. “He’s the richest man in the world, he has close ties to the US government, and he’s an incredibly litigious person,” says Pfefferkorn. In fact, Musk previously sued the Center for Countering Digital Hate; in the now-dismissed lawsuit, he claimed the group illegally collected data that showed an increase in hate speech after his purchase of the platform formerly known as Twitter.

Sexualized images of children are not the only problem with Grok’s image generation. The New York Times estimated that 1.8 million images generated by the AI over a nine-day period, or about 44 percent of posts, were sexually explicit images of adult women, which, depending on how explicit they are, may even be illegal to distribute. Using a variety of tools, the Center for Countering Digital Hate estimated that more than half of Grok’s images included sexualized imagery of men, women, and children.

The explosion of erotic imagery took off after Musk posted an AI-edited image of himself in a bikini on December 31st. A week later, Nikita Bier, X’s head of product, posted that the previous four days had also been the highest-engagement days ever on X.

Attorney Carrie Goldberg, whose history includes a Section 230-challenging stalking lawsuit against Grindr and another suit that ultimately shut down the chat client Omegle, is representing Ashley St. Clair, the mother of one of Musk’s children, in a case against xAI. St. Clair is one of the many women Grok has undressed, and she is now suing the platform, arguing that the images created a public nuisance. “In the St. Clair case we are focusing only on xAI and Grok because they are, from our perspective, directly liable,” she said in an email. “But I can imagine other sources of liability.” She specifically cited distributors such as Apple’s and Google’s app stores as areas of interest.

“A lot of this could go to court, and it will be up to judges to make decisions about what is ‘sexually explicit.’”

There are other potential legal implications as well. In 2022, Visa was sued for providing payment services to Pornhub because Visa allegedly knew that Pornhub was not adequately moderating CSAM. Other lawsuits followed. The judge in the Visa case rejected the argument that Section 230 shielded the defendants from liability, but the claims against Visa were dismissed in 2025, though the woman who filed the suit can file an amended complaint.

“A lot of this could go to court, and it will be up to judges to make decisions about what is ‘sexually explicit,’” says David Evan Harris, a public scholar at the University of California, Berkeley. Still, 45 states have criminalized AI-generated CSAM, and the federal Take It Down Act criminalizes deepfake nudes. The state of California has issued Musk and xAI a cease and desist following the announcement of an investigation into Grok’s images. Grok may be violating California’s deepfake porn ban, and California is just one of at least 23 states that have passed such laws.

This should matter to payment processors, because if they are knowingly transferring money that is the proceeds of a crime, they are engaged in money laundering, which can have serious consequences. California Attorney General Rob Bonta’s office declined to comment on whether Stripe, the credit card companies, or the app stores were also part of the Grok investigation, citing the ongoing inquiry. Money laundering laws are part of the reason financial institutions drop any website credibly accused of hosting CSAM.

But X has created a situation where payment processors are highly discouraged from taking the law seriously. This is because any state that files a lawsuit against processors over X is likely to be attacked by Musk for “censoring” X’s right-wing base. Furthermore, Musk – and possibly his friend, US President Donald Trump – could put a lot of resources behind getting payment processors off the hook.

It seems that when it comes to CSAM and deepfakes, the financial industry is no longer willing to regulate itself. So who will?
