Home Office use of AI in asylum cases could be unlawful, legal experts warn

Legal experts have warned that it could be illegal for the Home Office to use artificial intelligence (AI) when processing asylum claims, paving the way for legal action against the government.
Caseworkers at the Home Office are using AI to summarise recordings of interviews with asylum seekers. The technology is also used to look up policy guidance, such as information on whether a country is safe to return people to.
Asylum seekers are not told when AI has been used to summarise their interviews, leaving them unaware of the technology's impact on their claims.
The government's own evaluation of the AI tool that summarises asylum interview transcripts found that 9 per cent of the summaries were so flawed they had to be removed.
Five per cent of caseworkers who use AI to summarise policy documents said they were "unconfident about the accuracy of the tool". An evaluation of a 2024 pilot of the technology suggested that AI interview summaries could save 23 minutes per case, and that using AI to search for information about a migrant's country of origin could save 37 minutes per case.
A legal opinion produced by barristers at Cloisters Chambers and Doughty Street Chambers for the Open Rights Group, and seen by The Independent, argues that the Home Office's use of AI is "likely to be unlawful" because it does not meet a set of legal obligations and standards set out in the government's AI playbook.
These include being transparent with the public about how AI is used and ensuring that alternatives are considered before using these tools.
The AI tools were first trialled by Home Office caseworkers as part of a pilot programme in 2024 and were rolled out more widely in 2025. In an announcement in April 2025, then home secretary Yvette Cooper promised that AI would help authorities make quick decisions on claims and "prevent asylum seekers from being left in limbo at the expense of taxpayers".
There is no publicly available data on how many asylum requests have been decided with the help of artificial intelligence.
While the backlog of initial asylum decisions has fallen under Labour, the number of people waiting for the outcome of appeals has soared.
New court statistics released last week show that more than 100,000 people were awaiting appeals against their asylum decisions at the end of December 2025. According to analysis of the data by the charity the Refugee Council, around 36 per cent of appeals are successful, rising to 66 per cent when Home Office reassessments are included.
Imran Hussain, director of external affairs at the Refugee Council, said the figures showed “the poor quality of the Home Office’s decision-making process”.

In the legal opinion, lawyers argue that the government has failed to put in place safeguards that would ensure "meaningful human control" over AI tools, and has failed to adequately consider how decisions are affected by AI-generated content.
They warn that the adoption of AI tools risks decision-makers taking into account misinformation and missing relevant facts when determining an asylum seeker’s claim.
They also argue that the government has failed to comply with some of the ethical principles to which ministers have committed. These include commitments to transparency and the fair treatment of people with protected characteristics such as gender, race or disability.
The Independent has previously reported warnings about the Home Office's plan to use AI facial recognition technology to assess the age of unaccompanied asylum-seeking children.
Robin Allen KC and Dee Masters of Cloisters Chambers, who helped produce the legal opinion, said: “Where AI tools are used without adequate safeguards, there is a real risk of illegal or unfair decisions.” They called for “full transparency” on how AI is used.
Sara Alsherif, director of Open Rights Group's immigrant rights programme, called for "an immediate ban on the use of these tools", adding that "these tools are not the solution". The group believes the legal opinion could pave the way for legal challenges against the government by asylum seekers affected by the use of artificial intelligence.
She continued: "Determining whether someone can seek asylum in the UK is one of the most serious and life-changing decisions the government can make. There must be the highest level of transparency, fairness and accuracy.
"However, asylum seekers are not even informed that opaque artificial intelligence tools are being used to assess their situation, nor are they given the opportunity to correct any errors that may be made."
A Home Office spokesman said: "AI will not be deciding asylum claims. It will strengthen our support to caseworkers and enable faster, high-quality decisions to be made by trained officials."