UK police forces lobbied to use biased facial recognition technology

Police forces have successfully lobbied for the use of a facial recognition system known to be biased against women, young people and members of ethnic minority groups, after complaining that another version produced fewer potential suspects.
UK forces use the police national database (PND) to conduct retrospective facial recognition searches, which allow a suspect’s “probe image” to be compared against a database of more than 19 million custody photos for potential matches.
The Home Office last week acknowledged the technology was biased and said it was “acting on the findings” after a review by the National Physical Laboratory (NPL) found the technology misidentified Black and Asian people and women at significantly higher rates than white men.
Documents seen by the Guardian and Liberty Investigates reveal the bias was known for more than a year and police forces debated overturning an initial decision designed to address it.
Police bosses were told the system was biased after a review by the NPL, commissioned by the Home Office, found in September 2024 that it was more likely to suggest false matches for probe images depicting women, Black people and those aged 40 and under.
In response, the National Police Chiefs’ Council (NPCC) ordered that the confidence threshold required for potential matches be raised to a level at which the bias would be significantly reduced.
That decision was reversed the following month after forces complained that the system was producing fewer “investigative leads”. NPCC documents show that the higher threshold reduced the proportion of searches returning potential matches from 56% to 14%.
The Home Office and the NPCC decline to say what threshold is now in use, but the latest NPL study found that the system can produce false positives for Black women almost 100 times more often than for white women in certain settings.
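The trade-off at the heart of the dispute can be sketched in a few lines: raising the confidence threshold discards weaker candidate matches, cutting false positives but also cutting the number of “leads” returned. The scores and threshold values below are invented for illustration; the PND’s actual algorithm and thresholds are not public.

```python
# Illustrative sketch only: how a confidence threshold filters candidate
# matches in a retrospective facial-recognition search. All numbers here
# are hypothetical, not the PND system's real values.

def candidate_matches(scores, threshold):
    """Return gallery indices whose similarity score clears the threshold."""
    return [i for i, score in enumerate(scores) if score >= threshold]

# Hypothetical similarity scores for one probe image against a gallery.
scores = [0.91, 0.72, 0.65, 0.58, 0.40]

# A permissive threshold returns more leads (and more false positives);
# a strict one returns fewer, higher-confidence leads.
low_bar = candidate_matches(scores, threshold=0.55)
high_bar = candidate_matches(scores, threshold=0.80)

print(len(low_bar), len(high_bar))  # 4 1
```

Because false positives are not spread evenly across demographic groups, lowering the bar to recover more leads also disproportionately increases misidentifications of the groups the NPL review flagged.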
In publishing these results, the Home Office said: “Testing identified that in a limited number of circumstances the algorithm was more likely to incorrectly include some demographic groups in search results.”
Explaining the impact of the short-lived increase in the confidence threshold, NPCC documents say: “The change significantly reduced the impact of bias on protected characteristics such as race, age and gender, but had a significant negative impact on operational effectiveness,” adding that forces complained “the once effective tactic was of limited benefit”.
The government has launched a ten-week consultation on plans to expand the use of facial recognition technology.
The policing minister, Sarah Jones, described the technology as “the biggest breakthrough since DNA matching”.
Prof Pete Fussey, who led an independent review of the Met’s use of facial recognition, said he was concerned about police forces’ apparent priorities.
He said: “This raises the question of whether facial recognition will only be useful if users accept biases around ethnicity and gender. Convenience is a weak argument for trumping fundamental rights and is unlikely to withstand legal scrutiny.”
Abimbola Johnson, chair of the independent scrutiny and oversight board for the police race action plan, said: “There has been little discussion at race action plan meetings about the rollout of facial recognition, despite it clearly aligning with the concerns of the plan.
“These revelations show once again that the anti-racism commitments the police force has made through its race action plan have not been translated into wider practice. Our reports have warned that new technologies are being introduced in an environment where racial inequalities, poor investigations and poor data collection already persist.
“Any use of facial recognition must meet stringent national standards, be independently reviewed, and demonstrate that it reduces rather than increases racial inequality.”
A Home Office spokesman said: “The Home Office takes the findings of the report seriously and we have already taken action. A new algorithm has been independently tested and found to have no statistically significant bias. It will be trialled and evaluated early next year.
“Our priority is to protect the public. This game-changing technology will support police in putting criminals and rapists behind bars. There is human involvement at every step of the process and no further action will be taken until trained officers carefully review the results.”




