Law enforcement has increasingly adopted AI for just about everything, from drafting police reports to facial recognition.
The results have been predictably disappointing. In one particularly glaring – and unintentionally humorous – example, the police department in Heber City, Utah was forced to explain why its AI report-writing software declared that an officer had somehow turned into a frog.
As Salt Lake City-based Fox 13 reports, it seems the flawed tool picked up unrelated background audio and spun it into an imaginary fairy tale ending.
“The body cam software and the AI report writing software picked up the movie playing in the background, which was ‘The Princess and the Frog,’” Police Sergeant Rick Keel told the broadcaster, referring to Disney’s 2009 musical comedy. “That’s when we learned the importance of getting these AI-generated reports right.”
The department had started testing AI-powered software called Draft One to automatically generate police reports from body camera footage. The goal was to cut down on paperwork – but as the glaring mistakes show, the results have been decidedly mixed.
Even a simple mock traffic stop meant to demonstrate the equipment’s capabilities turned into a disaster: according to Fox 13, the resulting report needed plenty of corrections.
Despite the shortcomings, Keel told the outlet that the tool is saving him “six to eight hours weekly now.”
“I’m not the most tech-savvy person, so it’s very user-friendly,” he added.
Draft One was first announced by police tech company Axon – the company behind the Taser, a popular electroshock weapon – last year. The software uses OpenAI’s GPT large language model to generate complete police reports from body camera audio.
Experts immediately warned that hallucinations could creep into these important documents.
“I worry that automation and the ease of the technology will cause police officers to be less careful in their writing,” Andrew Ferguson, a law professor at American University, told the Associated Press last year.
Others have warned that the software could exacerbate existing racial and gender biases, a troubling prospect given law enforcement’s historical role in perpetuating them long before the advent of AI. Generative AI tools have also been shown to exhibit persistent bias against both women and non-white people.
“The fact that the technology is being used by the same company that provides Tasers to the department is quite concerning,” Aurelius Francisco, co-founder of the Foundation for Liberating Minds in Oklahoma City, told the AP.
Critics also argue that the tool could shield officers from accountability when mistakes are made. According to a recent investigation by the Electronic Frontier Foundation, Draft One “seems to be deliberately designed to avoid audits that could provide any accountability to the public.”
According to records obtained by the group, “It is often impossible to tell which parts of a police report were generated by an AI and which parts were written by an officer.”
“Axon and its customers claim this technology will revolutionize policing, but it remains to be seen how it will transform the criminal justice system, and who this technology benefits most,” the foundation wrote.
The Heber City Police Department has not yet decided whether it will continue using Draft One. The department is also testing a competing AI software called Code Four, which was released earlier this year.
But given Draft One’s inability to distinguish between reality and the fantasy world envisioned by Disney, let’s hope the department thinks long and hard about the decision.
More on AI Policing: AI is disturbing police radio chatter, posting it online as ridiculous misinformation