Police AI chief admits crime-fighting tech will have bias but vows to tackle it

A police chief has admitted that artificial intelligence being used to boost crime-fighting will involve bias but has vowed to tackle the risks.
Labour wants to dramatically expand police use of AI in England and Wales; police chiefs also believe it could help law enforcement keep pace with emerging crime threats.
Alex Murray told the Guardian that a new national police AI centre would recognize the risks of bias and minimize them.
Bias in police use of AI arises when algorithms trained on historical data, which often reflect past human biases, produce systematically unfair results, such as over-targeting minority communities or misidentifying individuals based on race, gender, or socioeconomic status.
Murray, the National Crime Agency’s director of threat leadership and national lead for AI, said: “Once you recognize it [bias] and minimize it, how do you train officers to deal with the output so that bias is minimized further?
“If you talk about live facial recognition or predictive policing, there will be bias and you need to engage data scientists and data engineers to clean the data, train the model appropriately and then test it.
“There is no point in leaving something with unrecognized bias to the police and everything must be done to minimize this to a level where it can be understood and mitigated.”
Instances of bias have already surfaced in police use of AI-powered retrospective facial recognition, in which a suspect is compared to a database of images after the crime.
Live facial recognition, which is more controversial and less used by police, hunts for suspects in real time and also involves bias. A report published in December found that the retrospective facial recognition system used by police was being operated with inadequate safeguards.
The Association of Police and Crime Commissioners (APCC), which oversees local forces in England and Wales, said: “System failures have been known for some time, but these have not been shared with affected communities or leading industry stakeholders.”
APCC forensic science lead Darryl Preston, who is the police and crime commissioner for Cambridgeshire, said: “The discovery of a built-in bias in the police national database’s retrospective facial recognition system highlights the need for independent oversight of these powerful tools, even in limited circumstances.
“The use of technology is unacceptable unless it has been thoroughly tested to eliminate bias. This was clearly not the case in this example.”
Murray said the new national artificial intelligence centre, which will cost £115 million, will aim to reduce bias as well as assess and decide which products from private suppliers work. Currently, forces in the UK each make their own decisions, an approach seen as slow and wasteful.
Murray said police were in an “arms race” with criminals using the technology: “Anyone with imagination can use AI.”
In one case, a pedophile claimed that the images implicating him in child abuse were fake; police later had to disprove this claim to convict him.
Murray said the benefits of AI go far beyond “clichés about Minority Report and predictive policing.”
He added that AI has ranged from helpful to game-changing across the range of crimes and challenges facing policing, but that a human police officer must make the final decisions about the outcomes AI produces.
He said this could help police combat political agitators who spread fake images on social media to trigger violence on the streets.
Over time, this could aid manhunts, speed up the search for cars linked to suspects, save detectives the hundreds of hours they spend combing through extensive CCTV footage, and accelerate the trawl through suspects’ seized digital devices for incriminating evidence, Murray said.
“Things that take days, weeks, sometimes months can potentially take hours,” he said.
In a recent incident, four suspects from Luton were arrested for assault and theft at a cash machine. Police downloaded the data from the suspects’ phones and, thanks to artificial intelligence, secured charges within a few weeks.
The data was in Romanian; the AI scanned it, translated it, identified material relating to possible offences, and presented it all in a single package to detectives.
Bedfordshire police chief Trevor Rodenhurst told the Guardian: “This allowed us to obtain evidence from a large number of devices containing large amounts of data that we would not have been able to do otherwise.”
Rodenhurst said that as officers use AI and see its benefits, it changes the perspective of those on the front lines: “They’re no longer doubting, they’re asking when they can have it. This capability is transformative.”