But a confidential ShotSpotter document obtained by The Associated Press outlines something the company doesn’t always tout about its “precision policing system” — that human employees can quickly overrule and reverse the algorithm’s determinations, and are given broad discretion to decide if a sound is a gunshot, fireworks, thunder or something else.
Marked “WARNING: CONFIDENTIAL,” the 19-page operations document spells out how employees in ShotSpotter’s review centers should listen to recordings and assess the algorithm’s finding of likely gunfire based upon a series of factors that may require judgment calls, including whether the sound has the cadence of gunfire, whether the audio pattern looks like “a sideways Christmas tree” and if there is “100% certainty of gunfire in reviewer’s mind.”
Another part of the document underscores ShotSpotter’s longstanding emphasis on speed and decisiveness, and its commitment to classify sounds in less than a minute and alert local police and 911 dispatchers so they can send officers to the scene. Experts say such guidance under tight time pressure could encourage ShotSpotter reviewers to err in favor of categorizing a sound as a gunshot, even if some evidence for it falls short, potentially boosting the number of false positives.

“You’re not giving your humans much time,” said Geoffrey Morrison, a voice-recognition scientist based in Britain who specializes in forensic processes. “And when humans are under great pressure, the possibility of mistakes is higher.”
ShotSpotter installed its first sensors in Redwood City, California, in 1996, and for years relied solely on local 911 dispatchers and police to review each potential gunshot until adding its own human reviewers in 2011. As cities have weighed the system’s promise against its price tag -- which can reach $95,000 per square mile per year -- company employees have explained in detail how its acoustic sensors on utility poles and light posts pick up loud pops, booms or bangs and then filter the sounds through an algorithm that automatically classifies whether they’re gunfire or something else.
Experts say the extensive role of reviewers to backstop ShotSpotter’s gunshot-detection algorithm could bring in subjectivity and conflict with why artificial intelligence is often used in law-enforcement tools: to lessen the role of fallible humans.