
Armed police handcuff teen after AI mistakes crisp packet for gun in US

A US teenager was handcuffed by armed police after an artificial intelligence (AI) system mistakenly flagged him as carrying a gun, when in reality he was holding a bag of chips.

“The police showed up, like eight cop cars, and then they all came out pointing guns at me and talking about getting on the ground,” Taki Allen, a 16-year-old Baltimore student, told local outlet WMAR-2 News.

The Baltimore County Police Department said its officers “responded appropriately and proportionately based on the information provided at the time.”

He said the AI alert was sent to human reviewers, who found no threat, but the principal missed that finding and contacted the school’s security team, who eventually called the police.

But the incident has led some to call for a review of schools’ procedures for using such technology.

Mr. Allen told local news that he had eaten a bag of Doritos after football practice and put the empty bag in his pocket.

He said armed police arrived 20 minutes later.

“They told me to get on my knees and handcuffed me,” he said.

The Baltimore County Police Department told BBC News that Mr. Allen was handcuffed but not arrested.

“The incident was resolved safely after it was determined that there was no threat,” the statement said.

Mr. Allen said he now waits inside after football practice because he doesn’t think it’s “safe enough to go out, especially to have a bag of chips or a drink.”

In a letter to parents, principal Kate Smith said the school’s security team “quickly reviewed and rescinded the initial warning after confirming there was no weapon.”

“I contacted our school resource officer (SRO) and reported the issue to him, and he in turn contacted the local district for additional support,” she said.

“Police officers responded to the school, searched the individual and immediately confirmed they were not in possession of any weapons.”

However, local politicians called for further investigation into the incident.

“I’m calling on Baltimore County Public Schools to review procedures around the AI-assisted weapons detection system,” Baltimore County council member Izzy Pakota wrote on Facebook.

Omnilert, the provider of the AI tool, told BBC News: “We regret that this incident occurred and would like to express our concern to the student and the wider community affected by the events that followed.”

The company said its system initially detected what appeared to be a firearm, and the image was then reviewed by its verification team.

This was then forwarded to the Baltimore County Public Schools (BCPS) security team with more information “within seconds” for evaluation, Omnilert said.

The security firm added that its involvement in the incident ended after the alert was marked as resolved in its system, and that the system “worked as designed.”

“Although the object was later determined not to be a firearm, the process worked as intended: prioritizing safety and awareness through rapid human verification,” it said.

Omnilert describes itself as a “leading provider” of AI weapons detection, citing a number of US schools among the case studies on its website.

“Weapons detection in the real world is complex,” it says.

But Mr. Allen said: “I don’t think any chip bag should be mistaken for a gun.”

The ability of artificial intelligence to accurately identify weapons has come under scrutiny before.

Last year, US gun-scanning company Evolv Technology was banned from making unsupported claims that the AI scanners it supplies to the entrances of thousands of schools, hospitals and stadiums in the US could detect all weapons.

BBC News investigations showed that those claims were false.
