Illustration by Tag Hartman-Simkins/Futurism. Source: Ashley MacIsaac
Who needs vicious music columnists when you live in the age of AI?
Apparently not Ashley MacIsaac, the Canadian fiddler, singer and songwriter who was labeled a sex offender by Google’s AI overview.
According to the Canadian newspaper The Globe and Mail, event organizers at Sipekne'katik First Nation, north of Halifax, canceled MacIsaac's upcoming performance after Google incorrectly identified him as a sex offender.
The paper reports that the misinformation was the result of one of Google's AI Overviews – the summaries it superimposes above all other search results – which mixed up the musician's biography with that of another person with the same name.
“Google messed up and it put me in a dangerous situation,” MacIsaac told the newspaper.
Although the AI Overview has since been corrected, MacIsaac explained that the situation presents a major dilemma for him as a touring musician. For one thing, there's no telling how many other event organizers passed on hiring him because of the outrageous claim, or how many potential fans got the wrong impression and were never set straight.
“People should be aware that they should check their online presence to see if anyone else’s name has appeared,” MacIsaac told the Globe.
After the truth came out, the Sipekne'katik First Nation apologized and said the musician would be welcome to perform in the future.
“We deeply regret the damage this mistake has caused to your reputation, your livelihood and your sense of personal safety,” a First Nation spokesperson wrote in a letter shared with the newspaper. “It is important for us to clearly state that this situation was the result of misidentification caused by AI error, and is not a reflection of who you are.”
Meanwhile, a Google representative said that “Search, including AI overviews, is dynamic and changes frequently to show the most useful information. When issues arise – such as if our features misinterpret web content or miss some context – we use those instances to improve our systems, and may take action under our policies.”
Yet as MacIsaac rightly points out, reputational damage is a hard thing to undo. There's no telling how far the misinformation has spread – and when a corporation launches half-baked software with obvious flaws, who's responsible for the damage?
More on Google: Google caught replacing news headlines with AI-generated nonsense
