The Mystery of AI Gunshot-Detection Accuracy Is Finally Unraveling


Liz González’s neighborhood in East San Jose can be loud. Some of her neighbors apparently want the whole block to hear their cars, others like to light fireworks for every occasion, and occasionally there are gunshots.

In February 2023, San Jose began piloting AI-powered gunshot detection technology from the company Flock Safety in several sections of the city, including González’s neighborhood. During the first four months of the pilot, Flock’s gunshot detection system alerted police to 123 shooting incidents. But new data released by San Jose’s Digital Privacy Office shows that only 50 percent of those alerts were actually confirmed to be gunfire, while 34 percent were confirmed false positives, meaning the Flock Safety system incorrectly identified other sounds, such as fireworks, construction, or cars backfiring, as shooting incidents; the remaining 16 percent could not be verified either way. After Flock recalibrated its sensors in July 2023, 81 percent of alerts were confirmed gunshots, 7 percent were false alarms, and 12 percent could not be determined one way or the other.
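To make those percentages concrete, here is a minimal back-of-the-envelope sketch in Python. The 123-alert total and the confirmed and false-positive rates come from the report cited above; the undetermined share and the derived counts are approximations, since the published percentages are rounded.

```python
# Rough disposition counts for the first four months of San Jose's
# Flock pilot, derived from the report's rounded percentages.
# The counts below are approximations, not figures from the report itself.

total_alerts = 123

rates = {
    "confirmed gunfire": 0.50,
    "confirmed false positive": 0.34,
    "undetermined": 0.16,  # remainder implied by the two rates above
}

for outcome, rate in rates.items():
    print(f"{outcome}: ~{round(total_alerts * rate)} alerts ({rate:.0%})")

# Output:
# confirmed gunfire: ~62 alerts (50%)
# confirmed false positive: ~42 alerts (34%)
# undetermined: ~20 alerts (16%)
```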

For two decades, cities around the country have used automated gunshot detection systems to quickly dispatch police to the scenes of suspected shootings. But reliable data about the accuracy of the systems and how frequently they raise false alarms has been difficult, if not impossible, for the public to find. San Jose, which has taken a leading role in defining responsible government use of AI systems, appears to be the only city that requires its police department to disclose accuracy data for its gunshot detection system. The report it released on May 31 marks the first time it has published that information.

The false-positive rate is of particular concern to communities of color, where some residents fear that gunshot detection systems are needlessly sending police into neighborhoods expecting gunfire. Nonwhite Americans are more often subjected to surveillance by the systems and are disproportionately killed in interactions with police. “For us, any interaction with police is a potentially dangerous one,” says González, an organizer with Silicon Valley De-Bug, a community advocacy group based in San Jose.

San Jose did not attempt to quantify how many shooting incidents in the covered area the Flock system failed to detect, a measure known as the false-negative rate. However, the report says that “it is clear the system is not detecting all gunshots the department would expect.”

Flock Safety says its Raven gunshot detection system is 90 percent accurate. SoundThinking, whose ShotSpotter system is the most popular gunshot detection technology on the market, claims a 97 percent accuracy rate. But data from San Jose and a handful of other communities that have used the technologies suggest the systems, which use computer algorithms (and, in SoundThinking’s case, human reviewers) to determine whether the sounds captured by their sensors are gunshots, may be less reliable than advertised.

Last year, journalists with CU-CitizenAccess obtained data from Champaign, Illinois, showing that only 8 percent of the 64 alerts generated by the city’s Raven system over a six-month period could be confirmed as gunfire. In 2021, the Chicago Office of Inspector General reported that, over a 17-month period, only 9 percent of the 41,830 ShotSpotter alerts with recorded dispositions could be connected to evidence of a gun-related crime. SoundThinking has criticized the Chicago OIG report, saying it relied on “incomplete and irreconcilable data.”
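Lining up the figures reported in this article against the vendors’ claims makes the gap plain, with one caveat: the “confirmed rate” below is simply confirmed incidents divided by total alerts, a precision-style measure derived from the numbers above. The vendors have not published the methodology behind their 90 and 97 percent accuracy claims, so the two figures may not measure the same thing, and the Chicago rate counts only alerts that could be tied to evidence of a gun-related crime.

```python
# Confirmed-alert rates reported for the deployments cited in this
# article, set against the vendors' advertised accuracy. All rates are
# taken from the article; the comparison is illustrative, since the
# vendors' accuracy methodology is not public.

claimed_accuracy = {"Raven": 0.90, "ShotSpotter": 0.97}

confirmed_rates = [
    ("San Jose, first 4 months", "Raven", 0.50),
    ("San Jose, post-recalibration", "Raven", 0.81),
    ("Champaign, IL", "Raven", 0.08),
    ("Chicago (tied to gun-crime evidence)", "ShotSpotter", 0.09),
]

for place, system, rate in confirmed_rates:
    claimed = claimed_accuracy[system]
    print(f"{place}: {rate:.0%} confirmed vs. {claimed:.0%} claimed ({system})")

# Output:
# San Jose, first 4 months: 50% confirmed vs. 90% claimed (Raven)
# San Jose, post-recalibration: 81% confirmed vs. 90% claimed (Raven)
# Champaign, IL: 8% confirmed vs. 90% claimed (Raven)
# Chicago (tied to gun-crime evidence): 9% confirmed vs. 97% claimed (ShotSpotter)
```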
