A.I. “ShotSpotter” Conviction Overturned Due to “Scant Evidence” After Man Spends Almost One Year in Prison

Artificial Intelligence (A.I.) is NOT always accurate. Examples continue to be reported (see 1, 2). There is even an A.I. “Hall of Shame”. Experts frequently warn about using this technology, and rightfully so (see 1, 2, 3). Some people have been accused of and convicted of crimes based on inaccuracies (see 1, 2) and “scant evidence”.

More from the Associated Press:


How AI-powered tech landed man in jail with scant evidence

CHICAGO (AP) — Michael Williams’ wife pleaded with him to remember their fishing trips with the grandchildren, how he used to braid her hair, anything to jar him back to his world outside the concrete walls of Cook County Jail.

His three daily calls to her had become a lifeline, but when they dwindled to two, then one, then only a few a week, the 65-year-old Williams felt he couldn’t go on. He made plans to take his life with a stash of pills he had stockpiled in his dormitory.

Williams was jailed last August, accused of killing a young man from the neighborhood who asked him for a ride during a night of unrest over police brutality in May.

But the key evidence against Williams didn’t come from an eyewitness or an informant; it came from a clip of noiseless security video showing a car driving through an intersection, and a loud bang picked up by a network of surveillance microphones. Prosecutors said technology powered by a secret algorithm that analyzed noises detected by the sensors indicated Williams shot and killed the man.

“I kept trying to figure out, how can they get away with using the technology like that against me?” said Williams, speaking publicly for the first time about his ordeal. “That’s not fair.”

Williams sat behind bars for nearly a year before a judge dismissed the case against him last month at the request of prosecutors, who said they had insufficient evidence.

[Photo: ShotSpotter equipment overlooks the intersection of South Stony Island Avenue and East 63rd Street in Chicago on Tuesday, Aug. 10, 2021. (AP Photo/Charles Rex Arbogast)]

Williams’ experience highlights the real-world impacts of society’s growing reliance on algorithms to help make consequential decisions about many aspects of public life. Nowhere is this more apparent than in law enforcement, which has turned to technology companies like gunshot detection firm ShotSpotter to battle crime.

ShotSpotter evidence has increasingly been admitted in court cases around the country, now totaling some 200. ShotSpotter’s website says it’s “a leader in precision policing technology solutions” that helps stop gun violence by using “sensors, algorithms and artificial intelligence” to classify 14 million sounds in its proprietary database as gunshots or something else.

But an Associated Press investigation, based on a review of thousands of internal documents, emails, presentations and confidential contracts, along with interviews with dozens of public defenders in communities where ShotSpotter has been deployed, has identified a number of serious flaws in using ShotSpotter as evidentiary support for prosecutors.

AP’s investigation found the system can miss live gunfire right under its microphones, or misclassify the sounds of fireworks or cars backfiring as gunshots. Forensic reports prepared by ShotSpotter’s employees have been used in court to improperly claim that a defendant shot at police, or provide questionable counts of the number of shots allegedly fired by defendants. Judges in a number of cases have thrown out the evidence.

ShotSpotter’s proprietary algorithms are the company’s primary selling point, and it frequently touts the technology in marketing materials as virtually foolproof. But the company guards how its closed system works as a trade secret, a black box largely inscrutable to the public, jurors and police oversight boards.

The company’s methods for identifying gunshots aren’t always guided solely by the technology. ShotSpotter employees can, and often do, change the source of sounds picked up by its sensors after listening to audio recordings, introducing the possibility of human bias into the gunshot detection algorithm. Employees can and do modify the location or number of shots fired at the request of police, according to court records. And in the past, city dispatchers or police themselves could also make some of these changes.

Amid a nationwide debate over racial bias in policing, privacy and civil rights advocates say ShotSpotter’s system and other algorithm-based technologies used to set everything from prison sentences to probation rules lack transparency and oversight and show why the criminal justice system shouldn’t outsource some of society’s weightiest decisions to computer code.

When pressed about potential errors from the company’s algorithm, ShotSpotter CEO Ralph Clark declined to discuss specifics about the company’s use of artificial intelligence, saying it’s “not really relevant.”

“The point is anything that ultimately gets produced as a gunshot has to have eyes and ears on it,” said Clark in an interview. “Human eyes and ears, OK?”
____

This story, supported by the Pulitzer Center on Crisis Reporting, is part of an ongoing Associated Press series, “Tracked,” that investigates the power and consequences of decisions driven by algorithms on people’s everyday lives.

Read the full report.
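
To make the “human eyes and ears” review step concrete, here is a minimal, hypothetical sketch of the human-in-the-loop override flow the AP article describes. ShotSpotter’s actual algorithms are a trade secret, so everything here is an assumption: the names (AcousticEvent, final_label), fields, and values are invented for illustration and are not the company’s code or data model.

    # Hypothetical sketch -- NOT ShotSpotter's actual system. It only
    # illustrates the review flow the article describes: a machine
    # classifier labels a sound, then a human reviewer may override it.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class AcousticEvent:
        sensor_id: str                     # which microphone heard the sound (assumed field)
        machine_label: str                 # classifier output, e.g. "gunshot" or "firework"
        machine_confidence: float          # assumed model score in [0, 1]
        human_label: Optional[str] = None  # reviewer override, if any

        def final_label(self) -> str:
            # A reviewer's judgment, when present, replaces the algorithm's --
            # the step the article says can introduce human bias.
            return self.human_label or self.machine_label

    # Example: the classifier calls a sound a firework; a reviewer
    # listens to the recording and manually reclassifies it.
    event = AcousticEvent("sensor-042", machine_label="firework",
                          machine_confidence=0.61)
    event.human_label = "gunshot"          # manual override after listening
    print(event.final_label())             # -> "gunshot"

The design point this toy example makes is the one at issue in the article: whatever the algorithm outputs, the label that reaches a courtroom can be the product of a human decision layered on top of it.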

A 2019 survey revealed that 82% of Americans believed that A.I. was more harmful than helpful. Nevertheless, earlier this summer a federal agency asked for public comments on how much Americans trust A.I. Maybe this is because President Biden recently committed to incorporating more A.I. into our lives. Argh.


Originally published by BN Frank at Activist Post.
