Police federation considers legal action against Met over use of Palantir AI tool to monitor officers

A police association representing thousands of officers has said it is considering legal action against the Metropolitan Police over its use of an artificial intelligence tool developed by Palantir.
Over the weekend the Met said it had launched hundreds of investigations into suspected rogue officers flagged by the tool. Three men were arrested on suspicion of criminal offenses including abuse of authority for sexual purposes, fraud, sexual assault, rape and misconduct in public office, the force said.
Meanwhile, hundreds of people have been investigated or issued with prevention notices for offenses including misuse of the rota system for financial or personal gain and breaches of the force’s hybrid working policy, it added.
Responding on Monday, the Metropolitan Police Federation (MPF) described the use of the AI technology as an “outrageous and inexcusable invasion of privacy” and said it was considering legal action against the force.

The association, which represents more than 30,000 of the Met’s frontline police officers in London, warned officers against using work devices while off duty and said they had not been informed that artificial intelligence would be used “to analyze the movements of officers in the capital”.
It comes after the Met said it would use Palantir’s technology to detect potential professional standards concerns following a series of high-profile cases involving its officers, including Wayne Couzens, the Met officer who was jailed for life in September 2021 for the kidnap, rape and murder of Sarah Everard.
The force said the technology allowed it to bring together data it already legally held in one place to “identify potential standards, welfare or cultural concerns.” But the MPF warned that the use of artificial intelligence would “seriously damage” police officers’ trust in the force.
General Secretary Matt Cane said police officers had a right to privacy and questioned the “checks and balances” the force had in place.

“This use of AI will seriously undermine the confidence of Metropolitan Police officers in the force and will further damage already falling morale,” he said.
“Nobody wants bad police officers in policing,” he said. “But this use of AI to spy on our officers is not proportionate, fair or appropriate. It is an outrageous and inexcusable invasion of privacy.”
He added that the federation was aware of plans to upgrade the force’s existing statutory business monitoring software, but “was never informed that the upgrade would involve the deployment of Palantir’s artificial intelligence”.
“This constant 24/7 geolocation tracking is highly intrusive and carries the risk of officers being tracked while they are off duty, on rest days or at home. The presumption that officers will abuse their position, and the intrusion into their personal lives, is unacceptable.”
“Many officers are unaware of the full extent of this monitoring and how the AI system analyzes location data. There is a clear risk that this information could be misused to question overtime claims, sickness absences, performance or behavior without appropriate fact-finding or context.”
“Overall, the draconian approach raises significant legal and privacy concerns regarding proportionality, GDPR compliance and the right to privacy under Article 8 of the Human Rights Act.”
Mr Cane added: “The Federation now advises all members to be extremely careful about carrying Metropolitan Police-issued devices when off duty.
“The Federation is seeking urgent legal advice on these matters and will provide further guidance to members where necessary.”
Surveillance giant Palantir is known for providing data analysis software to the US military, the New York Police Department and ICE. The company was co-founded by billionaire tech entrepreneur Peter Thiel, an early supporter of US President Donald Trump.
The Metropolitan Police and Palantir have been contacted for comment.
