Home Office admits facial recognition tech issue with black and Asian subjects

Ministers have faced calls for stronger safeguards on the use of facial recognition technology after the Home Office admitted it was more likely to incorrectly identify black and Asian people in some settings than their white counterparts.
Following the latest testing carried out by the National Physical Laboratory (NPL) on the application of the technology to the police’s national database, the Home Office said it was “more likely to incorrectly include some demographic groups in search results”.
Police and crime commissioners said the publication of the NPL’s finding “highlighted a relevant bias” and urged caution against national expansion plans.
The findings were announced on Thursday, hours after policing minister Sarah Jones described the technology as the “biggest breakthrough since DNA matching”.
Facial recognition technology scans people’s faces and then compares the images to watch lists of known or wanted criminals. It can be used to review live footage of people passing cameras, compare their faces with those on a wanted list, or target people passing by cameras mounted by police officers.
Images of suspects can also be run back through police, passport or immigration databases to identify them and check their backgrounds.
Examining the police national database’s retroactive facial recognition tool at a lower threshold setting, analysts found that “the false positive identification rate (FPIR) for white subjects (0.04%) was lower than for Asian subjects (4.0%) and black subjects (5.5%)”.
The testing also found that the number of false positives was particularly high for black women. “The FPIR of black male subjects (0.4%) was lower than that of black female subjects (9.9%),” the report said.
The Association of Police and Crime Commissioners said in a statement that the findings showed entrenched bias. The statement said: “This means that in some cases black and Asian people are more likely to be incorrectly matched than their white counterparts. The language used is technical, but behind the details it is clear that technology has been applied to operational policing without adequate safeguards.”
The statement, signed by APCC leaders Darryl Preston, Alison Lowe, John Tizard and Chris Nelson, questioned why the findings were not published or shared with black and Asian communities at an earlier opportunity.
The statement said: “Although there is no evidence of adverse impact in any individual case, this is due to chance rather than design. System failures have been known for some time but have not been shared with affected communities or leading industry stakeholders.”
The government has announced a 10-week public consultation which it hopes will pave the way for more frequent use of the technology. The public will be asked whether police should be able to go beyond their own records and access other databases, including passport and driving licence images, to track criminals.
Officials are working with police to set up a new national facial recognition system that will hold millions of images.
Charlie Whelton, head of policy and campaigns at the campaign group Liberty, said: “The racial bias in these statistics shows the real-life harmful effects of allowing police to use facial recognition without appropriate safeguards. With thousands of searches a month using this discriminatory algorithm, there are now serious questions to be answered about how many people of colour are misidentified and what consequences this has.
“This report is further evidence that this powerful and opaque technology cannot be used without robust safeguards to protect us all, including true transparency and meaningful oversight. The government must halt the rapid rollout of facial recognition technology until these are in place to protect each of us and prioritise our rights, which we know the public wants.”
Former cabinet minister David Davis raised concerns after police leaders said cameras could be installed in shopping centres, stadiums and transport hubs to catch wanted criminals. He told the Daily Mail: “Welcome to Big Brother Britain. It’s clear the government plans to roll out this dystopian technology across the country. Nothing of this magnitude should happen without a full and detailed debate in the House of Commons.”
Authorities say the technology is needed to help catch serious criminals. They say there are manual safeguards written into police training, operational practice and guidance that require all potential matches returned from the police’s national database to be visually assessed by a trained user and investigating officer.
A Home Office spokesman said: “The Home Office takes the findings of the report seriously and we have already taken action. A new algorithm has been independently tested and shown to have no statistically significant bias. It will be deployed and evaluated early next year.
“Given the importance of this issue, we have asked the police inspectorate, as well as the forensics regulator, to review law enforcement’s use of facial recognition. They will assess the effectiveness of mitigating measures supported by the National Police Chiefs’ Council.”