
Essex police pause facial recognition camera use after study finds racial bias

Police in Essex have halted the use of live facial recognition (LFR) technology after a study found cameras were far more likely to target black people than people of other ethnicities.

The move to suspend the use of the artificial intelligence-supported systems was confirmed by the Information Commissioner’s Office (ICO), which regulates the use of the technology. LFR has so far been used by at least 13 police forces, in London, south and north Wales, Leicestershire, Northamptonshire, Hampshire, Bedfordshire, Suffolk, Greater Manchester, West Yorkshire, Surrey and Sussex.

The ICO said Essex police had paused LFR deployments “after identifying potential risks to accuracy and bias” and warned other forces to take mitigating measures. LFR systems are either mounted in fixed locations or deployed from vans. In January, the home secretary, Shabana Mahmood, announced that the number of LFR vans would increase five-fold to 50, deployed across police forces in England and Wales.

Essex police commissioned academics from the University of Cambridge to study how the technology performed in practice. The research involved 188 actors walking in front of cameras actively deployed from marked police vans in Chelmsford. The results, published last week, showed that about half of the people on the watch list were correctly identified and that misidentifications were extremely rare; but the system was more likely to correctly identify men than women, and was “statistically significantly more likely to correctly identify black participants than participants from other ethnic groups”.

Live facial recognition tools are becoming more widely available to police forces in England and Wales. Photo: Andrew Matthews/PA

The report concluded that this “raises questions of fairness that require ongoing monitoring”. One of its authors, the criminologist Dr Matt Bland, told the Guardian and Liberty Investigates: “If you’re a criminal who goes through the facial recognition cameras installed in Essex, you’re more likely to be on the police watch list if you’re black. In my view, that warrants further investigation.”

The problem is different from the more common public concern about the technology misidentifying innocent people. It was revealed last month that police arrested a man for a theft in a city he had never visited, 100 miles (161km) away, after retrospective facial recognition software mistook him for another person of south Asian descent.

Possible causes of the newly identified problem include the algorithm having been overtrained on the faces of black people. Experts believe it could be fixed by adjusting the system’s settings. A separate study of the same technology by the government’s National Physical Laboratory found that black men were the most likely to be correctly matched by the system, while white men were the least likely, although the effect was not statistically significant.

According to the Home Office, LFR cameras deployed in London between January 2024 and September 2025 led to the arrest of more than 1,300 people wanted for crimes including rape, domestic violence, theft and grievous bodily harm. But opponents of facial recognition technology said the recent research showed that warnings about biases in LFR had been borne out.

“Police up and down the country should be taking notice of this fiasco,” said Jake Hurfurt, head of research and investigations at Big Brother Watch. “Experimental, untested, inaccurate or potentially biased AI surveillance has no place on our streets.”

Essex police have been contacted for comment.
