Frontline workers say AI tools are making potentially harmful errors in social work records, ranging from false indications of suicidal ideation to transcripts littered with the word “unclear”.
Keir Starmer last year championed what he called an “incredible” time-saving social work transcription technology. But research across 17 English and Scottish councils, shared with the Guardian, has now found that AI-generated hallucinations are entering official records.
As many local authorities have begun using AI note-takers to speed up the recording and summarizing of meetings with adult and child service users, an eight-month study by the Ada Lovelace Institute found that “some potentially harmful misrepresentations of people’s experiences are occurring in official care records”.
The independent thinktank found that a social worker who used an AI transcription tool to create the summaries said the technology had incorrectly “indicated that there was suicidal ideation”, but “at no point did the client actually talk about … suicidal thoughts or plans, or anything”.
Another said that the AI’s notes might mention “fish fingers or flies or trees” when in fact a child was talking about his parents fighting. Social work experts said such errors are particularly worrying because they could cause a pattern of risky behavior to be missed in the record.
Other social workers raised concerns about inaccuracies when transcribing conversations with people who have regional accents. One described how their AI-generated transcripts were often peppered with the word “unclear”. Another said: “It’s become a running joke in the office.”
Dozens of councils, from Croydon to Redcar and Cleveland, have given social workers access to AI transcription tools that record and summarize case conversations. The potential time savings are appealing to chronically understaffed town halls.
One popular system, Magic Notes, is sold to councils at a cost of between £1.50 and £5 per hour of transcription. Most social workers interviewed used either the specialist Magic Notes tool or Microsoft’s general-purpose Copilot.
The research also found that AI transcription saved time on note-taking and freed up social workers to focus more on relationships with service users.
Based on interviews with 39 social workers, who were granted anonymity, the report said: “Our evidence shows that these tools can also improve the relational aspects of care work and the quality of information recorded by social workers.”
But when a social worker used an AI tool to reword care documents in a more “person-centred” tone, the system “put in all these words that weren’t said”. Another social worker said the technology blurred the line between their own assessment and the AI’s.
The report concluded, “AI-generated inaccuracies that enter these records can have far-reaching effects, such as a social worker making the wrong decision about a child’s care, which could lead to harm to the child and professional consequences for the social worker.”
According to the British Association of Social Workers (BASW), the impact of AI errors is already being felt across the profession, with reports of social workers failing to properly check AI note-takers’ output and facing disciplinary action for missing obvious errors. The association is calling on social work regulators to issue clear guidance on how and when AI tools should be used.
While some social workers feel “genuine excitement” about their potential, “these tools also introduce new risks to social work and society, from potential bias in report summaries to inaccurate ‘hallucinations’ in transcripts,” said Imogen Parker, associate director of the Ada Lovelace Institute. “These risks are not being fully assessed or mitigated, leaving frontline workers to deal with these challenges on their own.”
Social workers often receive very little AI training – in one case, just an hour. And while some said they spent up to an hour checking AI transcripts, others spent only a couple of minutes. One said it took them “actually five minutes to quickly screen it (…) and then cut it out and stick it on the system”. Another said AI-generated cut-and-paste care plans could be “terrible”.
Others said some colleagues were too overstretched, or simply too complacent, to check transcripts at all.
“The risk here is that people are not checking what is written for them,” said Andrew Rees, BASW strategic lead for England and Wales. “The time you spend writing helps you understand what you heard. If the computer is doing that for you, you’re missing out on important parts of reflective practice.”
Beam, which operates Magic Notes, insisted that its outputs were first drafts, not final records. “AI tools are being adopted by social workers for good reason,” said its co-founder Seb Barker. “Services are overwhelmed and difficult to access; a generation of social workers is at risk of being wasted, and the need for accurate, compliant documentation is increasing.”
He said a bias assessment had found that Magic Notes performed “consistently and equitably”, and highlighted its specialist features for social work tasks, including automated screening for hallucination risk. “Not all AI tools are the same, with non-specialized, low-quality, or generic tools failing to meet the specific needs of the field,” he said.
The UK government and Microsoft have been contacted for comment.
