
AI tools make potentially harmful errors in social work records, research finds

Frontline workers said AI tools were making potentially harmful errors in social service records, from fabricated warnings of suicidal ideation to outright “gibberish.”

Keir Starmer last year championed what he called “incredible” time-saving social work transcription technology. However, research shared with the Guardian, drawing on 17 English and Scottish councils, reveals that hallucinations produced by artificial intelligence are now beginning to emerge.

As scores of local authorities begin using AI note-taking devices to speed up recording and summarizing meetings with adult and child service users, an eight-month investigation by the Ada Lovelace Institute found that “some potentially harmful misrepresentations of people’s experiences are occurring in official care records”.

One social worker who used an AI transcription tool to create a summary told the independent think tank that the technology incorrectly “indicated suicidal ideation” when “at no point did the client actually talk about suicidal ideation or planning or anything.”

Another said the AI’s notes could refer to “fish fingers, flies, or trees” when a child was actually talking about their parents fighting. Social workers said such errors are particularly concerning because they can cause a pattern of risky behavior to be overlooked.

Other social workers have expressed concern about inaccuracies in transcribing conversations with people with regional accents. One described how AI-generated transcriptions often contain “gibberish words.” Another said: “This has become a joke in the office.”

Dozens of councils from Croydon to Redcar and Cleveland have given social workers access to AI transcription tools that record and summarize case conversations. The potential time savings are attractive to councils with chronic staff shortages.

A popular system called Magic Notes is sold to councils at between £1.50 and £5 per hour of transcription. Most social workers interviewed used either the specialist Magic Notes tool or the general-purpose Microsoft Copilot.

The research also found that AI transcription resulted in observable time savings, enabling social workers to focus more on relationships with service users.

“Our evidence suggests that these tools can also improve the relational aspects of care work and the quality of information recorded by social workers,” the report said, based on interviews with 39 anonymous social workers.

But when a social worker used an AI tool to redesign care documents with a more “person-centered” tone, the system added “all these unsaid words.” Another social worker reported that the technology “crosses the line between your assessment and the AI’s assessment.”

“AI-driven inaccuracies in these records could have far-reaching effects, such as a social worker making the wrong decision about a child’s care, which could harm the child and lead to professional consequences for the social worker,” the report concludes.

According to the British Association of Social Workers (BASW), the impact of AI errors is already being felt across the profession: there have been reports of social workers being disciplined for not properly checking the output of AI note-takers and missing obvious errors. The association is calling on social care regulators to issue clear guidance on how and when AI tools should be used.

Imogen Parker, associate director of the Ada Lovelace Institute, said that while there was “genuine excitement” among some social workers about the tools’ potential, “these tools also introduce new risks to social work and society, from possible bias in report summaries to erroneous ‘hallucinations’ in transcripts. These risks are not fully assessed or mitigated, leaving frontline workers to deal with these challenges on their own.”

Social workers often receive little AI training; in one case, just a single hour. Some social workers said they spent up to an hour checking AI transcripts, while others spent just two minutes. One said they spent “literally five minutes just scanning quickly […] and then cut and paste into the system”. Another said care plans created by AI and cut and pasted in this way could be “terrible.”

Others said some colleagues were too lazy or busy to check transcripts.

“The risk here is that people don’t check what is written for them,” said Andrew Reece, BASW’s strategic leader for England and Wales. “Spending time writing helps you make sense of what you hear. If the computer is doing this for you, you’re missing important parts of reflective practice.”

Beam, which operates Magic Notes, emphasized that its output is a first draft, not a final record. “AI tools are being adopted by social workers for good reason,” said co-founder Seb Barker. “Services are busy and difficult to access, a generation of social workers is at risk of burnout, and the need for accurate, compliant documentation is growing.”

The company said its bias assessment found Magic Notes performed in a “consistent and equitable” manner, and highlighted its specialist features for social work, including automatic hallucination-risk checks. “Not all AI tools are the same; non-specialized, low-quality or generic tools do not meet the specific needs of the sector,” it said.

The UK government and Microsoft have been approached for comment.
