How artificial intelligence is changing the reports police write



Officer Wendy Venegas spoke softly in Spanish to the 14-year-old standing on the side of a narrow residential road in East Palo Alto. The girl’s face was puffy from crying as she quietly explained what had happened.

The girl said her father had caught her and her boyfriend “doing stuff” that morning, and her dad had either struck or pushed the boy, Venegas later explained. Now, the police had arrived to interview all three of them. So far, this was all standard procedure.

But when it came time to turn this incident into a report, Venegas would have help from a new assistant: a cutting-edge artificial intelligence tool called Draft One.

East Palo Alto, a small working-class city that can feel a world away from its Silicon Valley neighbors, is among a handful of California departments, including Campbell, San Mateo, Bishop and Fresno, that have started to use or test the AI-powered software developed by Axon, an industry leader in body cameras and tasers. Axon said the program can help officers produce more objective reports in less time. But as more agencies adopt these kinds of tools, some experts wonder if they give artificial intelligence too big a part in the criminal justice system.

“We forget that that document plays a really central role in decisions that change people’s lives,” said Andrew Ferguson, a criminal law professor at American University Washington College of Law who wrote the first law review article on AI-assisted police reports, which he expects to publish next year.

From documenting the details of complex homicides to recording the basics of a stolen bicycle, police reports have been at the heart of police work.

“They actually are kind of the building block of the criminal justice system because they are the official sort of memorialization of what happened, when, and sometimes why,” Ferguson said.

Prosecutors make charging decisions, judges make bail decisions and people make decisions about their own defense based – at least in part – on what is on this initial piece of paper.

“If part of that is being shaped by AI, it raises some real concerns about whether we can rely on it,” Ferguson said. The potential for error or bias introduced by AI is still being studied. But, he added, law enforcement leaders have an understandable desire to improve efficiency.

Axon is marketing its Draft One tool as a force multiplier, which is attractive to many police departments struggling to recruit and retain officers, a crisis that many in law enforcement say was exacerbated by the murder of George Floyd and the subsequent protests.

“The pendulum is swinging, but many of them are still 15, 20% below their targeted force numbers,” Axon founder Rick Smith said on an August earnings call with shareholders. “And so that’s where we’re hearing this sort of magical feedback, where they’re like, ‘Man, with Draft One, if it’s freeing up 20, 25% of my officers’ day from writing reports, that’s almost like a 20% bump in my force power overnight.’”

East Palo Alto chief Jeff Liu said his agency isn’t immune to these staffing concerns. The department is budgeted for 36 sworn officers, including the chief, but he’s currently short eight positions. He doesn’t see Draft One as a turnkey solution but hopes it can help officers spend more time on the streets.

“If this AI is going to speed up the reports, but without compromising accuracy, I think it’s a win,” he said. East Palo Alto’s contract shows Draft One costs the city about $70 per body camera per month or about $40,000 per year.

Liu said although he doesn’t write many reports these days, he does use ChatGPT to draft social media posts and even condolence letters, which he then customizes in his own voice. Working with the popular AI chatbot made him more open to Draft One, he said. Draft One uses the same underlying AI as ChatGPT, but departmental data is stored on a secure government cloud service developed by Microsoft.

Axon is not the only company that offers this service. Truleo – a company that uses AI to analyze vast swaths of bodycam footage to make sure officers are acting professionally – offers a similar report-writing program, but it hasn’t been marketed or adopted as widely as Axon’s Draft One.

In nearby Santa Clara county, Campbell police captain Ian White said that in his department’s first month of testing Draft One, officers said it saved them about 50 hours overall. The police department in Fort Collins, Colorado, found that, on average, reports written with Draft One were produced in eight minutes, while those not using the software took 23 minutes.

But the first independent study of the Draft One software, published this week in the Journal of Experimental Criminology, did not corroborate the time savings that White and others reported. Researchers at the University of South Carolina ran a randomized controlled trial with a New Hampshire police department over the past year and found that officers who used Draft One did not write reports any faster than those in the control group.

Assistant professor Ian Adams, who led the study, said he can’t yet draw conclusions about why there were no time savings – he and his colleagues are still looking into that – but he said the results surprised him. He also cautioned against giving his findings too much weight. “It is in one agency, about one outcome at one point in time,” Adams said.

His team is still researching whether there could be other benefits like accuracy or completeness. If technology like Draft One can produce better-written reports, he said, “maybe they get returned for editing or revision less [often], and so you could maybe see systemic savings”.

Officer Venegas in East Palo Alto said the program helps her overcome writer’s block, especially after a long day on patrol. She can just push the Draft One button on her computer, and a narrative based on the audio transcript of her bodycam footage appears within seconds.

Officer Wendy Venegas at the East Palo Alto police headquarters. Photograph: Martin do Nascimento/KQED

“When you don’t know what words you’re trying to write, and then you just look, and you’re like, ‘Oh, that’s exactly what I was thinking!’ That’s the best,” she said. Draft One is also changing the way she works in the field. Because the report is based on the audio transcript, Venegas said she will purposefully talk about what is happening during an incident.

“I’ll be like, ‘Did you see that? The mirror is broken,’” Venegas said. “‘Did you see that? There’s stuff on the floor. The knife, the bloody knife, is on the floor.’”

Axon product designer Noah Spitzer-Williams said this was one of the most surprising and fascinating side-effects of the software: it incentivizes officers to be more verbal overall, even talking into their camera’s microphone to provide context – like the parole status of a subject or whether a weapon has been reported before arriving at a scene – so the audio transcript contains key details that Draft One puts in the report.

“Then, during the interaction, the officer is asking more questions,” Spitzer-Williams said. “They’re echoing back statements like, ‘OK, Jimmy. You’re giving me consent to search your backpack.’”

Spitzer-Williams said this also helps community relations because officers are explaining what they’re doing and why.

But research by the American Civil Liberties Union shows the ways officers’ real-time narration has also been used to manipulate evidence. A common example is when officers shout “stop resisting” to justify use of force even when the individual is complying. Axon’s Spitzer-Williams said he doesn’t believe Draft One will make this “real concern” any worse.

Spitzer-Williams also pointed to an Axon study that found reports written by the software tended to use less biased language than reports written by officers.

Back in a conference room at the East Palo Alto police department, Venegas read from her AI-generated report. “September 23 2024, at approximately 10.49, I, Officer Venegas, responded to a call for service involving a minor and a domestic disturbance.”

The program produced a good rough draft, she said, but it has some limitations. At this point, Draft One only understands English, so it got some things wrong, like who was speaking and who was related to whom.

“Sometimes it makes small mistakes like that,” Venegas said, “which are easy to correct.”

Some in the criminal justice sector said these seemingly small mistakes point to bigger questions of authorship, which can become critical in the adjudication process. Dr Matthew Guariglia, a policy analyst for the Electronic Frontier Foundation, said he’s concerned these reports are “going to destroy the ability to cross-examine officers. Because if an officer is caught in a lie on the stand, they can always just say, ‘Well, the AI wrote that.’”

White of the Campbell police department, which has been using the software since May, said his department’s policy ensures officers take responsibility for authoring the reports produced by Draft One. Even if an officer “screwed up” and an error was introduced into the report, he said, it should be easily resolvable by reviewing what he called the “gold standard” of evidence: the bodycam video.

“There’s error in any human activity,” White said. He thinks AI will make things more accurate, not less.

San Mateo county assistant district attorney Rebecca Baum, who has been talking with East Palo Alto and San Mateo police departments about their shift toward AI-assisted reports, said her office is cautiously optimistic about the new program. Her chief concern is that body cameras, and especially body-camera audio, don’t capture everything that happens during an incident.

“Witness demeanor, if someone’s under the influence of drugs or alcohol, if there’s injuries,” Baum said, “this is not coming through from an audio recording of the body camera.”

Officers need to remain engaged in the process of writing these reports, Baum said, so they don’t leave something out – particularly information that shows a person might be innocent, which prosecutors have a duty to turn over.

For its part, Axon has built a number of safeguards into the application intended to ensure that officers go through each report in detail, proofread it and confirm its accuracy.

On the computer screen, Officer Venegas demonstrated how each paragraph of the report includes comments that need to be resolved before she can move the report out of Draft One and into the departmental system.

And at the very bottom of the screen, Venegas pointed to a box she has to click, a final step that Spitzer-Williams, of Axon, said is “arguably the most important” given the potential ramifications of each report.

“I acknowledge that this report was generated using Draft One by Axon,” Venegas read aloud. “And that I further acknowledge that I have viewed the report in detail, made any necessary edits, and believe it to be an accurate representation of my recollection of the reported events. If needed, I’m willing to testify to the accuracy of this report.”

This story is a collaboration between KQED, Guardian US, and the California Newsroom
