Microsoft has said fake IT workers deployed by North Korea are using AI technology, including voice-altering tools, to dupe Western companies into hiring them.
A signature money-raising method of Pyongyang's is being augmented by AI, the US tech firm said, helping to create fake names and doctor stolen IDs to boost the credibility of false applicants for IT and software development jobs.
This scam typically involves state-backed fraudsters applying for remote IT work in the West using fake identities and enlisting the help of "facilitators" in the country where the targeted company is based. Once hired, they send their salaries back to Kim Jong-un's regime, and some have even threatened to release sensitive company data after being fired.
According to a blogpost from Microsoft's threat intelligence unit, Pyongyang is using AI to increase the effectiveness of its operations.
Microsoft listed several AI-related scams used by North Korean groups it has dubbed Jasper Sleet and Coral Sleet, following the cybersecurity industry's convention of assigning codenames to otherwise anonymous groups of attackers.
The tech company said scammers had used voice-altering software during remote interviews to disguise their accents, allowing them to pass as Western candidates. They have also used the AI app Face Swap to insert the faces of North Korean IT workers into stolen identity documents and to produce "polished" headshots for CVs.
“Jasper Sleet leverages AI across the entire attack lifecycle to identify, exploit, and abuse access at scale,” Microsoft said.
Last year, Microsoft said it had disrupted 3,000 Outlook and Hotmail accounts used by fake North Korean IT workers.
Microsoft said the fake employees had used AI tools to generate "culturally appropriate" name lists and matching email address formats to create false identities for job applications. The company said an example prompt could be "Create a list of 100 Greek names" or "Create a list of email address formats using the name Jane Doe."
The fraudsters use AI to find postings for software and IT-related roles on job platforms such as Upwork, then use the skill requirements listed in those ads to craft more effective applications. Upwork has said it takes "aggressive action to remove bad actors from our platform".
Microsoft said that once hired, the fake employees use AI to write emails, translate documents and generate code as they try to avoid being exposed as frauds or fired for poor performance.
Companies have also been urged to conduct job interviews for IT roles over video or in person to reduce the risk. Microsoft said interviewers can recognize a deepfake video or image through a series of tell-tale signs, such as pixelation around the edges of the face, eyes, ears and glasses, and inconsistencies in how light interacts with the AI-generated face.
