Expert rejects Met police claim that study backs bias-free live facial recognition use

A leading technology specialist has rejected the Metropolitan police's claim that its use of live facial recognition (LFR) is free of bias.
The Met is planning its largest and highest-profile use of LFR yet this bank holiday weekend at the Notting Hill carnival in west London.
The Guardian understands that despite the Equality and Human Rights Commission saying the force's use of LFR is unlawful, the Met insists it will deploy the technology in two areas on the approaches to the carnival.
The new claims come from Prof Pete Fussey, who led the only independent academic review of police use of facial recognition, scrutinising an earlier version of LFR for the Met between 2018 and 2019, and who currently advises other forces in England and abroad on its use.
The Met says its reformed use of LFR is backed by a 2023 study from the National Physical Laboratory (NPL) and is now effectively free of bias. However, Fussey said: “The claims about the absence of bias the Met makes from the NPL report are not borne out by the facts in that report.”
The sensitivity of the system can be varied for LFR operations. The more sensitive it is, the more people it flags, but the greater the potential for racial, gender and age bias. Zero is the most sensitive setting and one the least sensitive.
The NPL report found bias at a setting of 0.56. At a setting of 0.6, it found seven cases in which people in the tests were wrongly identified – false positives. All were from ethnic minorities.
The study's results were obtained after 178,000 images were fed into the system. Four hundred volunteers walked past the camera about 10 times each, giving the system roughly 4,000 chances to identify them correctly. The study assessed crowds at four sites in London as well as in Cardiff. Testing took place in sunny conditions, and Fussey said it was shorter than other countries' assessments of LFR.
From this sample, the report concluded there was no statistically significant bias at a setting of 0.6 or above – a claim the Met uses to defend its use and expansion of LFR.
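As an illustration only – not the NPL study's data or the Met's actual system – a match threshold works like this: each face comparison produces a similarity score, and only scores at or above the threshold are flagged as matches, so a lower threshold flags more people. A minimal sketch, with all scores invented:

```python
# Illustrative sketch of how a match threshold filters candidate matches.
# The scores below are invented for illustration; they are not data from
# the NPL study or the Met's deployed system.

def matches(scores, threshold):
    """Return the similarity scores that would be flagged as matches."""
    return [s for s in scores if s >= threshold]

# Hypothetical similarity scores from comparing passers-by to a watchlist.
scores = [0.41, 0.58, 0.61, 0.63, 0.66, 0.72]

# A lower (more sensitive) threshold flags more people...
print(len(matches(scores, 0.56)))  # prints 5
# ...a higher (less sensitive) threshold flags fewer.
print(len(matches(scores, 0.64)))  # prints 2
```

This is why raising the setting from 0.56 towards 0.64 reduces false positives: fewer comparisons clear the bar at all.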
Fussey said it was far too small a sample to support the Met's claims. “The MPS [Metropolitan Police Service] continually claim their systems are independently tested for bias. Scrutiny of this study reveals the data is insufficient to support the claims made.

“The Met's definitive conclusions rest on an analysis of seven false matches. This is from a system that scans the faces of millions of Londoners.”
The Met now uses LFR at a sensitivity setting of 0.64, a level at which the NPL study produced no false matches.
Fussey said: “According to their own research, false matches were not evaluated at the settings they claim are bias-free, which were actually 0.64 and above.
“Very few in the scientific community would say the evidence was sufficient to support these claims, made from such a small sample.”
Fussey added: “The evaluation clearly states there is bias in the algorithm, but claims this is eliminated if the system settings are changed. The problem is it is not sufficiently tested at these different settings, so it is difficult to support those claims.”
The Met's director of intelligence, Lindsey Chiswick, rejected Fussey's claims: “This is a genuine report by a world-renowned organisation. The Met's interpretation is based on what the independent report found.
“When we use LFR at the 0.64 setting – which is what we currently use – there is no statistically significant bias.
“We used the study to understand where potential bias could exist in the algorithm and to mitigate that risk.
“The findings show us the level at which to operate the algorithm to prevent bias, and we always work above that level and in a fair way.”
This weekend, warning signs will tell people that LFR is in use, placed next to cameras linked to a database of suspects.
Police believe its use in two areas on the approaches to the carnival will act as a deterrent. At the carnival itself, police are preparing to use retrospective facial recognition to catch suspects wanted for violence and assaults.
Fussey said: “Few people would doubt the right to use technology to keep people safe, but it should be done in accordance with proper accountability measures and human rights standards.”
The Met says that since 2024 the false positive rate of LFR has been one in every 33,000 cases. It declined to say how many faces have been scanned, but the figure appears to run into the hundreds of thousands.
There were 26 false matches in 2024, and eight so far in 2025. The Met says none of those people were detained, because when the computer system flags a match, the decision on whether to arrest rests with a police officer.
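Taking the article's figures at face value, the reported false positive rate and the 2024 false-match count imply a rough scale for the number of faces scanned. A back-of-envelope sketch, using only numbers stated above (this is an inference, not a Met-published total):

```python
# Back-of-envelope arithmetic from figures reported in the article:
# a false positive rate of one in 33,000 scans, and 26 false matches
# in 2024. The implied scan total is an inference, not an official figure.

false_positive_rate = 1 / 33_000   # one false match per 33,000 faces scanned
false_matches_2024 = 26

implied_scans_2024 = false_matches_2024 / false_positive_rate
print(f"{implied_scans_2024:,.0f}")  # prints 858,000
```

That implied total of roughly 858,000 scans in 2024 is consistent with the "hundreds of thousands" estimate above.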
Ahead of the carnival, the Met arrested 100 people, 21 of whom were jailed, and banned 266 from attending. The force also said it had seized 11 firearms and more than 40 knives.




