Removing Grok’s AI image editor or restricting access to it may not be enough to stop the flood of non-consensual sexual images being generated by AI. A report from the Tech Transparency Project (TTP) released Tuesday found dozens of AI “nudify” apps similar to Grok on Google’s and Apple’s platforms, as first reported by CNBC.
The TTP identified 55 apps on the Google Play Store and 48 on Apple’s App Store that “can digitally remove women’s clothing and render them fully or partially nude or wearing a bikini or other minimal clothing.” These apps were downloaded more than 705 million times worldwide, generating $117 million in revenue.
According to CNBC, Google has suspended “several” apps flagged by TTP, and Apple has removed 28 of them (two of which were later reinstated). This isn’t the first time AI nudify apps have drawn scrutiny: Apple and Google had to respond to similar reporting from 404 Media in 2024.
Apple and Google have addressed the apps named in TTP’s report, but X and Grok remain freely available on both companies’ app stores. As The Verge’s Elizabeth Lopatto has reported, Apple and Google immediately removed the ICEBlock app “while allowing X to produce offensive images of a woman killed by ICE.”
