Extremists are using AI voice cloning to promote propaganda. Experts say that this is helping them to grow.


While the artificial intelligence boom is upending parts of the music industry, voice-generating bots are also becoming a boon for another unlikely corner of the Internet: extremist movements, which are using them to recreate the voices and speeches of prominent figures in their milieu, and experts say it’s helping them grow.

“The adoption of AI-enabled translation by terrorists and extremists marks a significant evolution in digital propaganda strategies,” said Lucas Weber, senior threat intelligence analyst at Tech Against Terrorism and research fellow at the Soufan Center. Weber specializes in monitoring the online tools of terrorist groups and extremists around the world.

“Earlier methods relied on human translators or rudimentary machine translation, often limited by language fidelity and stylistic nuances,” he said. “Now, with the rise of advanced generative AI tools, these groups are able to produce seamless, contextually accurate translations that preserve tone, emotion, and conceptual intensity across multiple languages.”

Among the neo-Nazi far-right, the adoption of AI-voice cloning software has already become particularly prevalent, with several English-language versions of Adolf Hitler’s speeches garnering millions of streams on X, Instagram, TikTok and other apps.

According to recent research published by the Global Network on Extremism and Technology (GNET), extremist content creators have turned to voice-cloning services, notably ElevenLabs, feeding them archival speeches from the Third Reich era, which are then processed to mimic Hitler speaking in English.

Neo-Nazi accelerationists, the types who plot terrorist acts against Western governments in order to provoke social collapse, have also turned to these tools to spread updated versions of their ultra-violent message. For example, Siege, an insurrectionary manual written by the American neo-Nazi James Mason that became the de facto Bible for organizations like the Base and the now-defunct Atomwaffen Division, was turned into an audiobook in late November.

“For the past several months I have been involved in creating an audiobook of Siege by James Mason,” said a prominent neo-Nazi influencer with a heavy presence on X and Telegram, who pieced together the audiobook with the help of an AI tool.

“Using a custom voice model of Mason’s, I recreated each newsletter and most of the accompanying newspaper clippings to look like the original published newsletters.”

The influencer said he appreciated the power of Mason’s writing from “pre-Internet America” and the experience of hearing it rendered in a modern voice.

“But hearing the startling accuracy of the predictions made in the early eighties really marks a milestone on the road and has changed my view of our shared purpose at a fundamental level,” he said.

At its peak in 2020, the Base organized a book club around Siege, which had a significant impact on many members, who discussed its lessons for a hypothetical war against the US government. A nationwide FBI counterterrorism investigation ultimately implicated more than a dozen of its members on various terrorism-related charges that same year.

Joshua Fisher-Birch, terrorism analyst at the Counter Extremism Project, said, “Creators of audiobooks have released similar AI content before; however, Siege’s history is more notorious,” due to its cult-like status among some of the online extreme right, its promotion of lone actor violence, and its being required reading by many neo-Nazi groups that openly support terrorism and whose members have committed violent criminal acts.

Weber says that pro-Islamic State media outlets on encrypted networks are actively “using AI to create text-to-speech representations of ideological content from official publications” in order “to supercharge the spread of their message by converting text-based propaganda into engaging multimedia narratives”.

Jihadi terrorist groups have found utility in AI for translating extremist teachings from Arabic into easily digestible, multilingual content. In the past, the American imam-turned-al-Qaeda operative Anwar al-Awlaki had to personally deliver English-language lectures for recruitment propaganda in the Anglosphere. The CIA and FBI have frequently cited al-Awlaki’s voice as a major vector in spreading al-Qaeda’s message.

On Rocket.Chat – the Islamic State’s preferred communications platform, which it uses to communicate with its followers and recruits – a user posted a video clip with attractive graphics and Japanese subtitles in October, commenting on the difficulty of producing such content before the advent of AI.

“Japanese will be an extremely difficult language to translate into English from its original state while maintaining its eloquence,” said a pro-Islamic State user. “It should be known that I do not use artificial intelligence for any related media, with a few exceptions regarding audio.”

Beyond the Islamic State, groups across the ideological spectrum have started using free AI applications, notably OpenAI’s chatbot ChatGPT, to enhance their overall activities. The Base and adjacent groups have used it to produce imagery, and acknowledged using such tools to streamline planning and research as early as 2023.

Counterterrorism officials have long viewed the Internet and technological advancement as a constant game of keeping pace with the terrorist groups that exploit them. The Base, the Islamic State and other extremists have already taken advantage of emerging technologies like cryptocurrency to anonymously raise funds and to share files for 3D-printed firearms.
