Thinking Through the ShotSpotter Debate
Executive Summary
Recent years have seen significant activism against “gunshot detection technology,” or GDT—most prominently, the ShotSpotter product. This technology monitors neighborhoods for loud percussive sounds likely to be gunshots and—after a brief review process to limit false positives—alerts the police to the incidents and their locations. Opponents claim that the technology is inaccurate, racially biased (as sensors are disproportionately placed in minority neighborhoods), ineffective in helping police respond to crime, and simply not worth the cost.
The purpose of this report is to dispassionately assess the evidence regarding each of these criticisms of GDT. Key findings include:
- Racial bias: Sensors appear to be placed based on levels of gun violence—i.e., where they are most needed—though these areas do tend to be disproportionately minority.
- Accuracy: There are relatively few proven false alerts. However, police often fail to find actionable evidence of a shooting when responding.
- Effectiveness: ShotSpotter delivers on its promise of getting police to shooting scenes faster, identifying gunfire that otherwise would have gone unnoticed, and increasing evidence collection. But many studies are unable to measure increases in clearance rates or reductions in shootings in places where GDT is deployed.
- Costs: The direct costs of GDT are generally a tiny fraction of total police spending in big cities, and officers spend a relatively small share of their total time responding to alerts, though resource- and staff-constrained departments will feel these burdens most acutely.
The cost-benefit trade-off of GDT will vary from department to department, especially because some are better equipped than others to handle the additional police workload and comprehensively process new evidence. In addition, reasonable people may disagree about how to value the technology’s proven benefits to investigations in light of unclear effects on clearance and crime rates. One sensible approach is for departments to focus primarily on hiring adequate staff to respond to calls and to create a strong infrastructure to support investigations. Departments can then explore whether the additional information provided by GDT is worth the costs.
Introduction
Gunshot detection technology (GDT), such as ShotSpotter, has sparked a complicated debate that has played out in academic studies, police department reports, and media articles. This report will offer an overview of how GDT works, common criticisms of the technology, and the academic literature on its effects. Evidence shows that:
- ShotSpotter sensors appear to be deployed in areas of cities with the most gun violence, though these areas do tend to be disproportionately minority.
- Few ShotSpotter alerts are demonstrably incorrect. However, many alerts are unproductive—police find no evidence of a gun being discharged, regardless of whether one was discharged. Departments should know that they will likely need to send officers to numerous GDT calls to produce evidence of one shooting.
- ShotSpotter improves investigations. It leads to the collection of shell casings and abandoned guns that would not otherwise have been found, and it gets police to shooting scenes more quickly than 911 calls do—though, notably, the most serious ShotSpotter alerts are often accompanied by a 911 call. In some cities, the technology contributes to a broader infrastructure for analyzing evidence and addressing gun violence, such as Crime Gun Intelligence Centers and Real-Time Crime Centers.
- ShotSpotter has direct costs ($65,000–$90,000 per square mile annually), as well as opportunity costs associated with sending officers to alerts. These costs tend to be a small percentage of overall police spending and officer time, but cities with funding and staffing constraints will feel them most acutely. When staffing is scarce, the correct prioritization of ShotSpotter alerts, relative to other urgent calls, is especially important.
- Even a very small, difficult-to-measure impact on homicides and other serious crimes—whether through deterrence, incapacitation by arrest, or faster medical treatment—might justify ShotSpotter’s expense in cities with high rates of gun violence, assuming standard estimates of the societal costs of these crimes.
- Despite ShotSpotter’s investigative benefits, most studies fail to find concrete crime reductions or clearance-rate increases. Many of these studies have wide margins of error, however, and thus cannot rule out small benefits that might justify the system’s cost. Further, cities vary immensely in their approaches to the technology, meaning that discouraging results in one place do not necessarily bode ill for another place.
Many common complaints about ShotSpotter are not well-founded, but the system’s trade-offs may play out differently in different places—depending, especially, on departments’ ability to respond to alerts and to quickly process ballistic evidence.
ShotSpotter is not a panacea for gun violence or a racist ploy to surveil minority communities. It is, instead, a tool for improving gunshot response at the margin, which may be a worthwhile investment for departments that can make the most of it. Before adopting or continuing to use the technology, however, departments should assess how it fits into a broader strategy for investigating gun crime and whether they have the resources to respond to additional ShotSpotter-generated gunshot calls. For departments that lack the staffing or infrastructure to promptly respond to those calls and conduct thorough investigations, addressing those shortcomings should be the main priority.
The Practice and Promise of GDT
ShotSpotter, produced by the company SoundThinking, is the most prominent brand of GDT. Other brands, such as Flock, offer conceptually similar products.
A ShotSpotter system involves several sensors installed in a coverage area chosen by the purchasing city, ranging from a few square miles to vast swathes of land. Microphones monitor for loud percussive sounds, and an algorithm assesses whether they are likely gunfire. By comparing the time at which the sound reaches different microphones, the system can determine where it came from, purportedly within an 80-foot radius.[1] When a gunshot is detected, human reviewers double-check the audio before “publishing” the alert, including the location and audio, to the police, typically within one minute of the sound being made.
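SoundThinking does not publish its localization algorithm, but the basic time-difference-of-arrival principle described above can be illustrated with a minimal sketch. The sensor coordinates, arrival times, and speed of sound below are hypothetical values chosen for illustration, not the company's actual method or data.

```python
# Minimal sketch of time-difference-of-arrival (TDOA) localization, the general
# principle behind acoustic gunshot location. All values are illustrative assumptions.
import numpy as np
from scipy.optimize import least_squares

SPEED_OF_SOUND = 343.0  # meters per second (approximate; varies with temperature)

# Hypothetical sensor positions (meters, local grid) and sound arrival times (seconds).
sensors = np.array([[0.0, 0.0], [400.0, 0.0], [0.0, 400.0], [400.0, 400.0]])
arrival_times = np.array([0.526, 0.978, 0.785, 1.139])  # fabricated readings

def tdoa_residuals(source_xy):
    """Difference between observed and predicted arrival-time gaps relative to sensor 0."""
    dists = np.linalg.norm(sensors - source_xy, axis=1)
    predicted_tdoa = (dists - dists[0]) / SPEED_OF_SOUND
    observed_tdoa = arrival_times - arrival_times[0]
    return predicted_tdoa[1:] - observed_tdoa[1:]

# Solve for the source location by least squares, starting from the sensors' centroid.
fit = least_squares(tdoa_residuals, x0=sensors.mean(axis=0))
print("Estimated shot location (meters):", fit.x.round(1))  # roughly (100, 150) for these inputs
```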
Published alerts are sent directly to officers via a mobile app so that they can respond immediately when available,[2] though dispatchers are also involved in getting police to the scene and prioritizing the alerts amid other calls. The technology cannot reliably detect shots fired indoors, from suppressed weapons, or from firearms that are .25 caliber or smaller.[3] (Among crime handguns traced nationwide, only about 5% are that small. Rifles account for only 10% of traced guns, but about 25% of traced rifles are .22 caliber.)[4] To address privacy concerns, only gunshot audio is preserved, while the rest of the audio is purged every 30 hours. The risk of the microphones picking up street conversations appears minimal.[5]
The police response upon arriving at a GDT alert varies across cities. Researchers Daniel S. Lawrence and Kenneth J. Novak have categorized agency responses into three tiers. In Tier 1, often in agencies with high crime and low police staffing, “officers merely respond to the scene and conduct cursory inspections for gunshot victims, potential suspects with guns, or individuals fleeing the area.”[6] Tier 2 responses involve officers leaving their cars, thoroughly canvassing the scene for evidence, and leaving contact cards and door hangers for potential witnesses. (Some departments even use trained dogs to locate casings.)[7]
In Tier 3, departments place a high priority on using ShotSpotter in conjunction with other technologies. For example, departments can utilize the National Integrated Ballistic Information Network (NIBIN), which is able to match shell casings fired by the same gun, connecting crimes to one another. Recovered firearms can also be traced to their most recent retail purchase (through eTrace) and test-fired to link them to NIBIN data. These efforts are a specialty of Crime Gun Intelligence Centers, which focus on collecting and rapidly analyzing ballistic evidence. These centers—often operated jointly by a police department and another agency such as the federal Bureau of Alcohol, Tobacco, Firearms and Explosives—have proliferated over the past decade, with early evidence suggesting that they effectively boost clearance rates.[8] In some cities, including New York, GDT responses can involve cutting-edge technology such as drones, which can arrive quickly, take video, and even follow fleeing suspects.[9]
In theory, GDT could prove beneficial in several ways. It could deter shooters from firing their weapons in areas known to be covered by the technology. The individual sensors tend to be hidden, but when cities adopt GDT, news organizations and other sources often report that GDT will be used and in which neighborhoods.[10] It could save the lives of gunshot victims by facilitating a faster emergency response. It could help police find fleeing shooters or useful evidence to inform detectives’ investigations,[11] increasing the likelihood that a murderer is arrested, charged, and incapacitated. ShotSpotter data and audio have also been used as evidence in court.[12]
Users of GDT have offered numerous anecdotes about how the technology has helped them respond to gun violence. Here is how the technology worked in one incident in Durham, North Carolina:
On July 25, a shooting in the target area (Colfax and Linwood) resulted in several SS [ShotSpotter] notifications. Officers arrived at the scene less than four minutes after the shooting, and found a victim with life-threatening injuries. They administered first aid to stop the bleeding, and the victim survived. In this case there was a 911 call received 47 seconds after the first SS alert. It is plausible, though uncertain, that the quicker response enabled by ShotSpotter saved the victim’s life.[13]
Similarly, in Columbus, Ohio, after a two-year-old inside a house was struck by gunfire, ShotSpotter notified the police before 911 calls arrived. Again, the faster response may have helped save a victim’s life.[14] And here is an example from Houston:
On March 13, 2021, officers were notified by ShotSpotter of a shooting that occurred at 6812 Eastwood. Upon arrival, the Complainant was unresponsive with a gunshot wound to the head and the suspect was [gone on arrival]. Officers put out a [“be on the lookout” notification,] and the suspect was found [roughly half a mile away] by a deputy. Suspect was arrested and charged with Murder.[15]
A more controversial anecdote involving GDT is the police killing of 13-year-old Adam Toledo in Chicago. A little after 2:30 am on March 29, 2021, both ShotSpotter and a 911 call alerted police to multiple shots fired. Arriving only a few minutes later, police found Toledo and an older friend, who was later prosecuted, but not convicted, for the initial shooting.[16] One officer chased each of them. In an alley, Toledo slowed down and turned to face the officer pursuing him, throwing a gun behind a fence and raising his hands. Less than a second after Toledo had been clearly holding a gun, the officer shot him; Toledo died despite receiving immediate medical attention.[17] The officer was not charged criminally,[18] though police officials did pursue disciplinary action for rules violations.[19]
The case is a tragic example of what can happen when police respond quickly to gunshots. But for the purposes of the GDT debate, it is important to remember that the technology provides only information; it does not dictate how police respond.
Approximately 170 cities currently use ShotSpotter. However, several cities have recently abandoned GDT, some because of political pressure and others after concluding that the costs of the system outweighed its benefits.[20] Perhaps most famously, Chicago stopped using ShotSpotter in late 2024, fulfilling a campaign promise made by Mayor Brandon Johnson—despite efforts to save the technology, including an offer from aldermen and business leaders, shortly after the system was deactivated, to help pay for it.[21] Other cities that have ceased using GDT after adopting it (or participating in a pilot program) include Atlanta,[22] Indianapolis,[23] San Antonio,[24] Durham,[25] Winston-Salem,[26] and Charlotte.[27]
Race and GDT Coverage
A common criticism of GDT is that it is racist, with sensors placed disproportionately in minority neighborhoods.[28]
As noted above, while the neighborhoods covered by ShotSpotter are often publicly reported, the exact sensor locations are not public information. However, in early 2024, an anonymous source leaked the locations of more than 25,000 ShotSpotter sensors to Wired magazine. Analyzing the data, the magazine’s Dhruv Mehrotra and Joey Scott concluded that “nearly 70 percent of people who live in a [census block group] with at least one SoundThinking sensor identified in the [census bureau’s American Community Survey] as either Black or Latine” and that “nearly three-quarters of these neighborhoods are majority nonwhite.”[29]
This raises an important question: Does race play a role in how police departments choose their coverage areas? Or are the sensors simply placed in the areas with the most shootings, with no consideration of racial demographics?
Since rates of gun violence differ by race, an entirely race-neutral placement of sensors could, on average, place far more sensors in minority neighborhoods, particularly black neighborhoods. For instance, according to data from the Centers for Disease Control, between 2018 and 2023, non-Hispanic black Americans died of gun homicides at an annual rate of 25.2 per 100,000, 13 times the rate for non-Hispanic whites.[30] For Hispanics, the rate was 4.7 per 100,000—a less drastic difference but still more than twice that of non-Hispanic whites.[31] In other words, racial disparity need not imply racial bias.
Several analyses support this intuition. Comparing police enforcement actions following 911 calls and gunshot-detection alerts in Chicago, Eric Piza and his coauthors found similar racial disparities across the two.[32] In an otherwise critical evaluation of ShotSpotter, New York comptroller Brad Lander found that “sensors were generally placed in areas with the highest numbers of confirmed shootings,” although the order in which precincts received coverage did not always precisely track their rankings by raw number of shootings.[33] An assessment in St. Louis County from the Policing Project at the NYU School of Law “did not find any evidence that the technology exacerbated racial disparities in arrests,” though this project received (unrestricted) funding from ShotSpotter.[34]
To further investigate this question, I combined data from three sources: 1) gun-violence incidents identified by latitude and longitude in the Gun Violence Archive (GVA)[35] during 2014–22, removing self-inflicted gunshots (which tend to occur indoors, are not crimes, and are not well covered in the data set) as well as officer-involved incidents, to the extent feasible based on the database’s coding scheme;[36] 2) the Wired ShotSpotter data, which the lead author provided to me in a censored format to protect the privacy of those who allowed sensors on their property; and 3) black population shares that I calculated from the census bureau’s 2018–22 five-year American Community Survey data, which was also referenced in the Wired article.
Figure 1 depicts the census block groups that fall entirely within Chicago, New York, and Miami (the three cities with the largest numbers of sensors in the Wired data) and that have available demographic data. Some areas within the city limits are thus carved out, especially block groups near city borders and unpopulated areas such as Central Park and O’Hare Airport.
The first column depicts known gun-violence incidents as dots on the map. The second column identifies block groups with at least one sensor, which broadly illustrates each city’s coverage area and is similar to other sources.[37] In the third column, block groups are shaded according to their black share of the population. Here, a 100% black area will appear the same shade as a group with sensors in the second column.

A visual comparison of these maps makes clear that ShotSpotter sensor placement indeed closely tracks gun-violence concentrations and that black population share overlaps substantially with both. Still, important caveats make this method illustrative rather than definitive and limit the feasibility of more detailed statistical tests, though that is a promising avenue for future research.
For example, the sensors’ coverage areas do not perfectly correspond to the borders of the block groups in which they are located: a large or odd-shaped block group may contain sensors but not be fully covered by the system, for example, while a smaller one with no sensors may be covered, in whole or in part, by sensors located nearby.[38] Further, although the first two columns appear quite similar, one would not expect a one-to-one perfect match between GVA-recorded gun violence in this particular period and sensors at the block-group level. When determining sensor placement, police departments likely consider other geographies (such as precincts or districts) and rely on data sets with varying time periods and collection methods, depending on when they originally adopted the technology and how they keep statistics on gun violence internally.
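For readers interested in what a more formal version of this comparison might look like, the following is a rough sketch of the block-group aggregation described above, using the open-source geopandas library. The file and column names are placeholders, not the actual analysis files (the sensor data was provided in a censored format), and a real analysis would need to handle the caveats just discussed.

```python
# Illustrative sketch of the block-group comparison described above: count gun-violence
# incidents and sensors per census block group, alongside black population share.
# File names and column names are placeholders; point files are assumed to carry only geometry.
import geopandas as gpd

# Census block-group polygons with an ACS black-population-share column (assumed prepared).
block_groups = gpd.read_file("chicago_block_groups.geojson")  # columns: GEOID, black_share, geometry

# Point layers: gun-violence incidents and sensor locations, reprojected to match the polygons.
incidents = gpd.read_file("gva_incidents.geojson").to_crs(block_groups.crs)
sensors = gpd.read_file("sensor_locations.geojson").to_crs(block_groups.crs)

def count_points(points, polygons, name):
    """Spatially join points to block-group polygons and count points per GEOID."""
    joined = gpd.sjoin(points, polygons[["GEOID", "geometry"]], predicate="within")
    return joined.groupby("GEOID").size().rename(name)

summary = block_groups.set_index("GEOID")[["black_share"]]
summary = summary.join(count_points(incidents, block_groups, "incidents")).fillna({"incidents": 0})
summary = summary.join(count_points(sensors, block_groups, "sensors")).fillna({"sensors": 0})

# Compare average incident counts and demographics in block groups with and without sensors.
print(summary.groupby(summary["sensors"] > 0)[["incidents", "black_share"]].mean())
```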
One way of thinking about the issue is that, since gun violence is concentrated among black Americans and in heavily black neighborhoods, the stakes of gun-violence prevention are higher for the black community. Successful violence reduction will disproportionately benefit African Americans, but ineffective or abusive efforts concentrated in high-violence neighborhoods will disproportionately harm them, too. Perhaps the real question, therefore, is whether the technology helps or hurts.
The Information That ShotSpotter Provides
Other criticisms of ShotSpotter instead focus on the system’s basic accuracy. Here, it is important to note that accuracy can be measured in several different ways, such as the percentage of known outdoor gunshots that were detected or the share of alerts that were proved not to be gunshots. Alternatively, one could deliberately test the system by firing blanks within range of the sensors.
According to Lawrence and Novak, “numerous studies have consistently affirmed the technology’s accuracy, ranging from it being able to accurately identify 70 percent of test gunfire in St. Louis (Mares, 2021), 81 percent in Redwood, CA (Mazerolle et al., 2000; Watkins et al., 2002), 90 percent in Las Vegas, NV (Koren, 2018), and over 99 percent in Charleston, SC (Goode, 2012; NIJ, 2007).”[39] A ShotSpotter-commissioned analysis, based on clients’ reports of known errors, finds accuracy above 97%.[40]
However, evaluating the real-life performance of the system is complicated by the fact that, in many cases, gunshot detections produce no evidence of gunfire beyond the audio itself—but also no proof that something else made the noise. If a shooter uses a revolver (which accounts for about 7% of traced crime guns nationally)[41] or picks up his shell casings, or if officers simply fail to find any casings, bullet holes, or witnesses, a perfectly accurate alert may lead nowhere. As a result, in many cases, we simply do not know with certainty what made the sounds in question.
Reported accuracy varies by city, but assessments generally show that police find hard evidence of a gun discharge in fewer than half of all GDT responses.
- In New York, Brad Lander’s 2024 report on the system noted that, out of 940 alerts in June 2023, 47 were “unfounded” and 771 “unconfirmed,” while 122, or 13%, stemmed from confirmed shootings. Put differently, NYPD officers had to respond to about seven ShotSpotter calls to find evidence of one shooting. In addition, the report found that ShotSpotter generally met its contractual obligation not to miss more than 10% of known gunfire incidents—despite particular struggles in Manhattan,[42] where, according to the company, noise, construction, and building density made gunshot detection more difficult.
- A 2021 report from Chicago’s inspector general estimated that, of ShotSpotter alerts that the city’s police responded to in 2020 and early 2021, about 9% led to “evidence of a gun-related criminal offense,” or about 1 out of every 11.[43] A more recent South Side Weekly analysis of city data found that ShotSpotter missed about 20% of outdoor gunshots that should have been in range of its sensors but could not determine to what extent this was due to shots that were suppressed or too small-caliber for detection.[44]
- A 2024 analysis of Winston-Salem ShotSpotter alerts dating back to August 2021 found that “1,614 (43.3%) alerts produced distinct evidence of gunfire, 1,995 (53.5%) did not yield conclusive evidence of gunfire, and only 40 (1%) cases turned out to be confirmed false positives.”[45]
- In Durham’s pilot program, “over half of all gunshot notifications in the target area in 2023 were SS alerts in which there was no 911 call, and in 91% of those alerts the responding officers did not find evidence of a crime in which a gun was fired.”[46]
- When Indianapolis piloted both ShotSpotter- and Flock-brand GDT systems in 2022, the final report found an evidence recovery rate of 8.2%.[47]
- In Houston, from late 2020 to late 2021, about 20% of ShotSpotter alerts resulted in an offense report.[48]
Importantly, GDT systems alert police to many incidents that do not otherwise prompt calls from residents, providing entirely new information. An influential 2016 study using ShotSpotter data found that “only 12% of gunfire incidents result in a 911 call to report gunshots.”[49]
It’s hard to argue with the proposition that police should know about gunshots—or, at least, noises indistinguishable from gunshots—that occur within their jurisdictions so that they can respond appropriately and, at a minimum, keep track of the information. But it is worth fleshing out which types of incidents are most likely to result in 911 calls and which are more likely to lead only to ShotSpotter alerts. When a 911 call accompanies a ShotSpotter alert, the latter no longer provides purely unique information, though it may still provide faster notice and a more accurate location.
Data published by the Chicago Police Department[50] indicate that, from the beginning of 2024 through September 22 of that year (when the system was discontinued), more than 33,000 ShotSpotter events occurred. Only about 25% of these had a corresponding call to 911. However, of the 160 cases in which officers rendered aid to a gunshot victim following a ShotSpotter alert, only eight were incidents with no 911 call. Similarly, approximately 400 out of 500 recovered firearms, 400 out of 500 arrests, and 27,000 out of 33,000 recovered shell casings in the wake of ShotSpotter alerts came from incidents that had also been called in.
Most of the gun-crime evidence collected by Chicago Police, therefore, came from incidents with a call; but notably, a significant fraction came from incidents without a call. Relatedly, since the technology was discontinued, CWB Chicago has extensively reported on “incidents of people being found shot in areas previously served by ShotSpotter where the technology, had it not been dismantled, could have played a critical, helpful role,” including “people being found shot without corresponding 911 calls of shots fired and people being found shot in areas where 911 callers provided inaccurate or overly broad locations of gunfire.” It had tallied 55 such incidents as of this writing.[51]
Consistent with the patterns seen in Chicago, the above-noted analysis of the Winston-Salem data concluded that “the number of rounds fired during an incident and whether the incident was connected to a violent crime (assault, homicide or robbery) are key predictors of an alert also receiving a call from residents.”[52] The Durham assessment found that while 57% of gunshot reports in the covered area came from ShotSpotter alone (as opposed to a 911 call or both sources), this was true of just 31% of confirmed shootings without gunshot wounds—and 4% of shootings with gunshot wounds.[53]
Few ShotSpotter alerts are proven to be entirely unfounded, and the system informs departments about many apparent gunshots of which they would not otherwise be aware. However, these marginal gunshot reports are, disproportionately, less serious and less likely to produce actionable evidence than those accompanied by 911 calls in many cities.
Departments might reasonably take this pattern into account when prioritizing ShotSpotter calls that conflict with other emergencies. Officers should also be trained to understand the low chance of finding evidence when responding to a GDT call, which will help to assuage the concern—from both a community-relations and a legal standpoint—that officers will act aggressively toward anyone found in the general vicinity of an alert.
Importantly, GDT may discourage residents from calling the police when they hear gunshots—a sort of “let ShotSpotter handle it” phenomenon. Decreases in 911 gunshot calls have been observed, for example, in Kansas City[54] and St. Louis.[55]
How ShotSpotter Affects Investigations and Outcomes
A common finding across numerous cities is that GDT can direct police to the scene more quickly than the 911 system.[56]
In the Chicago data discussed above, for cases with a ShotSpotter alert but no 911 call, it took police about 12 minutes to arrive from the time of the alert, versus about 14.5 minutes in cases with a call but no ShotSpotter data. Likewise, the Durham analysis suggested that “in 2023, the median response time (from alert to arrival at the scene) was 5.5 minutes, which reflected a decline by 1.2 minutes in the target area compared with the rest of the city. There was a still greater improvement in the 90th percentile [i.e., slower] of response times, which declined by 3.6 minutes.”[57] An analysis of data from Milwaukee, Denver, and Richmond, CA, found a nearly one-minute improvement in response time that stemmed mainly from “the duration between the notification and the assignment of an officer, rather than the actual time taken by the officer to arrive at the scene after being assigned to the event.”[58] Indianapolis experienced an especially large difference, with a 6.3-minute response time for GDT calls versus 13 minutes for shots-fired calls for service.[59]
GDT improves police response by specifying the exact location of an incident. A study using detailed data from two departments finds that “police officers stop their vehicles more often and closer to the detected/reported crime scene on GDT alerts than [calls for service] for all crime types across both Chicago and Kansas City.”[60]
On the margin, quicker response logically implies a greater likelihood of finding victims in need of help, witnesses, or even fleeing perpetrators, all of which become scarcer as time passes. Empirically, some studies[61] have found increases in firearm recoveries or NIBIN leads from the use of GDT. According to a RAND study of Chicago’s Strategic Decision Support Centers (elsewhere known as Real-Time Crime Centers), “personnel indicated that by directly monitoring ShotSpotter alerts at the SDSC (rather than at the citywide Office of Emergency Management Center), they are able to quickly get cameras trained on the area to begin gathering video evidence.”[62] Some evidence shows that faster emergency treatment marginally improves gunshot survival,[63] that video evidence improves clearance rates,[64] and that NIBIN leads can help solve cases, especially when processed quickly.[65]
It is not surprising that faster response and additional ballistic evidence improve clearance rates, save injured victims, and reduce crime, to some extent. But does GDT meaningfully affect these key outcomes in a directly measurable way? On that question, the hard evidence is decidedly mixed.
Most, though not all, studies do not find that GDT leads to statistically measurable improvements in clearance and crime rates—but many of these studies are not precise enough to find small effects (a point to which we shall return). Examples:
- In a recent series of studies funded by a federal grant, Eric L. Piza and coauthors analyzed ShotSpotter in Chicago and Kansas City, looking to see what changed when areas were newly covered by the system, relative to trends in comparable, noncovered areas. Despite guiding officers to gunshot locations faster and boosting evidence recovery, the system did not produce measurable improvements in case clearance or reductions in gun crime.[66] One of these studies used Chicago crime data during 2008–19 (though the large-scale ShotSpotter rollout began in 2017). The large data set allowed for some of the more precise estimates in the literature: the 14 districts where the technology was used each experienced, on average, between 0.7 fewer and 2.5 more fatal shootings in total, relative to what one would expect from the control group.[67]
- A study of St. Louis by Dennis Mares and Emily Blackburn found that the system did “not significantly reduce violent crime levels in any of the study periods.”[68]
- By contrast, a study of Winston-Salem from Mares and Andrew Buettner concluded that the “ShotSpotter area saw a significant 24% reduction in assaults and homicides,” including a decrease of about 87 aggravated assaults.[69]
- A study covering Denver, Milwaukee, and Richmond, CA, from Daniel Lawrence and two coauthors, found that while “GDT is generally but not consistently associated with faster response times and more evidence collection,” the crime effects were “more uneven but generally cost-beneficial.”[70]
- Studies from Camden, NJ, found that ShotSpotter reduced response and transport time for gunshot-wound victims.[71] The authors were unable to measure whether there was a reduction in mortality.
Many questions are still unresolved. For example, might the small, statistically insignificant effects found in many studies add up to something more definitive when they are combined? And what explains the differences across cities, even in studies that shared an author, such as the large effect in Winston-Salem versus the disappointing results in St. Louis? These differences could reflect methodological choices, statistical flukes, or genuine gaps between effective and ineffective uses of the technology.
Especially deserving of further study is how GDT interacts with a department’s staffing level and broader strategy. Possibly, ShotSpotter is more effective when departments have the time and staff to thoroughly investigate reports, for example, or when it is integrated into a broader, dedicated infrastructure for addressing gun crime that can make use of the evidence collected.
Strong evidence shows that better police staffing in itself reduces crime,[72] and a growing body of evidence supports efforts such as Crime Gun Intelligence Centers (which focus on analyzing ballistic evidence and connecting crimes to one another) and Real-Time Crime Centers[73] (which provide live support to police using various forms of technology, including surveillance cameras and GDT alerts). A reasonable approach for departments might be to focus on staffing and infrastructure first—and then to ask whether the additional information provided by ShotSpotter is worth the cost.
“If It Saves Just One Life”: The Cost-Benefit Question
It’s a cliché, and a fallacy, to assert that an effort must be worthwhile “if it saves just one life.” The world is full of trade-offs that involve the risk of death, from highway driving to unhealthy eating, and we must make those trade-offs by weighing the costs and benefits, rather than resolve every one in favor of safety.
How can policymakers compare lives saved with dollars spent in an objective manner? By placing a value on a human life. The Department of Transportation (DOT), for example, currently recommends a “value of a statistical life” (VSL) of almost $14 million in cost-benefit analyses of infrastructure projects that could, for instance, affect traffic deaths,[74] based on studies that estimate individuals’ “willingness to pay” for a lower chance of dying.[75]
Researchers have estimated the costs of other types of crime as well. For example, the above-noted analysis of Winston-Salem, which found that GDT led to about 87 fewer aggravated assaults, estimated that each such assault imposes costs of roughly $100,000 and thus that the benefits of ShotSpotter easily outweighed the approximately $350,000 in costs (including the $205,000 contract, plus wear and tear on vehicles, investigation costs, etc.).[76] Nonetheless, officials ultimately decided not to continue ShotSpotter after the program’s federal grant expired, noting that the system covered only a small part of the city and that many alerts did not generate evidence.[77]
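A quick back-of-the-envelope check of the arithmetic implied by those Winston-Salem figures (the inputs are the approximations cited above):

```python
# Back-of-the-envelope check of the Winston-Salem figures cited above.
averted_assaults = 87          # estimated reduction in aggravated assaults
cost_per_assault = 100_000     # approximate societal cost per aggravated assault
annual_cost = 350_000          # approximate total annual cost of the program

estimated_benefit = averted_assaults * cost_per_assault
print(f"Estimated benefit: ${estimated_benefit:,}")                          # $8,700,000
print(f"Benefit-to-cost ratio: {estimated_benefit / annual_cost:.0f} to 1")  # roughly 25 to 1
```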
Crime-cost estimates, however, can go only so far. There is considerable variation in the estimates themselves and debates about the appropriate methods,[78] and it is surely unreasonable to insist that every government, at every level, must fund any policy or program that is likely to save more than one life for every $14 million spent, especially as the governments themselves will recoup very little of this benefit (and thus will need to raise taxes to pay for it, which itself will have downstream effects).[79] There are also uncomfortable questions about whether to value all lives equally: a metric known as “quality-adjusted life years”[80] places more value on the lives of the young, for instance. VSL estimates, furthermore, can be far lower for those with low incomes[81] and those who deliberately choose dangerous occupations—such as Army enlistees[82] and, more relevant to the context of street gun violence, gang members.[83] On the other hand, stopping one crime today may contribute to a safer society in the longer term, and these dynamic effects would be difficult to incorporate.
When it comes to spending on core government functions such as transportation infrastructure and policing, however, DOT’s VSL estimate can at least serve as a useful anchor. And in the context of ShotSpotter, that estimate highlights a problem with many studies that estimate crime effects. Even the largest American cities that have used the technology—Chicago and New York—have spent only about $10 million per year on recent ShotSpotter contracts, suggesting that it might take only a tiny reduction in homicides, perhaps even a single homicide per year, to make the system worthwhile.
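A rough break-even calculation, using the contract and VSL figures just cited, illustrates the point; the inputs are approximations, not precise estimates:

```python
# Rough break-even calculation: how many homicides per year would a big-city ShotSpotter
# contract need to prevent to "pay for itself" at DOT's value of a statistical life?
annual_contract_cost = 10_000_000       # approximate annual cost in Chicago or New York
value_of_statistical_life = 14_000_000  # DOT's recommended VSL

break_even_homicides = annual_contract_cost / value_of_statistical_life
print(f"Break-even: about {break_even_homicides:.2f} prevented homicides per year")  # ~0.71
```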
But it is difficult to quantify the effect of ShotSpotter so precisely, and many studies do not come close. Cook and Soliman’s evaluation of ShotSpotter in Durham forthrightly declined to evaluate the system’s impact on gunshot injuries, given the difficulty of separating ShotSpotter’s impact from the “large natural variation in rates of gun violence.” It noted that a 10% decline would be a remarkable achievement—but it would be undetectable statistically. It further pointed out that a previous study (which used county-level data, even though GDT is typically deployed only in parts of a city) similarly failed to rule out sizable effects.[84] An analysis of Kansas City, meanwhile, put the technology’s impact at somewhere between −23% and +30% for fatal shootings, and between −26% and +15% for nonfatal shootings, after accounting for the confidence intervals.[85]
The GDT issue would be simpler if studies produced clear results—such as definite reductions in shootings and increases in clearance rates; precisely zero impact on crime and clearance rates regardless of how it is implemented; or a complete absence of benefits, including at the investigatory stage. Instead, on the whole, studies suggest modest investigative benefits and effects on crime and clearance rates that are often too small to measure. These effects could range from nonexistent to large enough to “cover” the system’s cost and likely vary widely, depending on how the tech is used.
Whether to implement GDT is ultimately a judgment call for cities and police departments. It depends on how highly a city prioritizes responding to gun violence and on how confident it is that its police department can translate the technology’s proven investigative benefits into crime and clearance benefits, even if such effects may prove hard to measure.
It is helpful to look more broadly at how much cities are already spending on policing and how much officer time is consumed by ShotSpotter’s opportunity costs. Do GDT’s marginal investigative benefits come at a reasonable marginal cost? Or is GDT a major expenditure that could seriously erode other priorities?
The direct cost of ShotSpotter is about $65,000–$90,000 per square mile per year.[86] This is certainly a lot of money compared with a typical household budget, but it is quite small in the context of urban police department budgets. Think again of Chicago, which had a particularly large installation spanning more than 100 square miles: it was spending about $9 million per year on ShotSpotter before ending its contract,[87] while the department’s official budget was $1.9 billion in 2022,[88] amounting to $8 million for each of Chicago’s roughly 230 square miles.[89] In other words, the city was spending well under 1% of its police budget on ShotSpotter, or a bit over $3 per resident.
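The arithmetic behind these Chicago comparisons, with the city's population of roughly 2.7 million as an added assumption:

```python
# Reproducing the rough Chicago spending comparisons above (figures as cited in the text).
shotspotter_annual = 9_000_000   # approximate annual ShotSpotter cost
police_budget = 1_900_000_000    # 2022 official police budget
city_area_sq_mi = 230            # approximate city area
population = 2_700_000           # approximate population (assumption, not cited above)

print(f"ShotSpotter share of police budget: {shotspotter_annual / police_budget:.2%}")  # ~0.47%
print(f"Police spending per square mile: ${police_budget / city_area_sq_mi:,.0f}")      # ~$8.3 million
print(f"ShotSpotter cost per resident: ${shotspotter_annual / population:.2f}")         # ~$3.33
```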
Table 1 provides cost estimates for a convenience sample of other big cities with recent, readily available numbers; some of these cities have opted not to continue ShotSpotter. In all cases, ShotSpotter contracts accounted for less than 1% of the police budget.
Table 1
ShotSpotter Contracts vs. Total Police Spending in Selected Cities
| City | Annual ShotSpotter Cost | Annual Police Budget | ShotSpotter Share of Budget | ShotSpotter Cost per Resident |
| --- | --- | --- | --- | --- |
| New York | $7 million[90] | $6 billion[91] | <1% | $1 |
| Detroit | $2 million[92] | $400 million[93] | <1% | $3 |
| Louisville | $800,000[94] | $200 million[95] | <1% | $1 |
| Columbus | $800,000[96] | $400 million[97] | <1% | $1 |
| Oakland | $800,000[98] | $300 million[99] | <1% | $2 |
| Fresno | $900,000[100] | $300 million[101] | <1% | $2 |
| Houston | $700,000[102] | $900 million[103] | <1% | <$1 |
| Winston-Salem | $200,000[104] | $100 million[105] | <1% | $1 |
Note: Numbers rounded
As is always the case when comparing data cobbled together across numerous cities and various sources, a few caveats are in order. Cities often experience controversies over what is included in the police budget,[106] and GDT contracts often cover several years or address existing and expansion areas separately, so the numbers, often drawn from media reports, should be seen as ballpark figures and have been rounded to one significant digit. Further, small cities with low rates of gun violence will obviously experience different trade-offs, and even large cities will experience diminishing returns as they expand GDT coverage to safer areas. The Trace recently reported that some cities have purchased coverage despite suffering not even one shooting per month on average.[107]
Yet it seems fair to say that for larger urban departments, ShotSpotter coverage is a relatively small expense. Small expenses are not automatically worthwhile but are easier to justify.
A useful comparison is the cost of adding a single police officer for a year, which is $100,000–$200,000 for large cities (including pay, benefits, training, equipment, etc.).[108] Departments might consider, roughly speaking, whether one to three square miles of GDT coverage is at least as valuable as adding one officer, which has been estimated to abate 0.1 homicides on average.[109]
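The rough equivalence implied by those figures; the ranges are the approximations cited above:

```python
# Rough equivalence implied by the figures above: square miles of GDT coverage per officer-cost.
officer_cost_range = (100_000, 200_000)   # annual fully loaded cost of one officer
gdt_cost_per_sq_mi = (65_000, 90_000)     # annual GDT cost per square mile

low = officer_cost_range[0] / gdt_cost_per_sq_mi[1]   # cheapest officer vs. priciest coverage
high = officer_cost_range[1] / gdt_cost_per_sq_mi[0]  # priciest officer vs. cheapest coverage
print(f"One officer costs as much as roughly {low:.1f} to {high:.1f} square miles of coverage")
# ~1.1 to ~3.1 square miles, consistent with the one-to-three range above
```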
Also notable are recent efforts to develop far lower-cost versions of GDT, which, if they can someday match current commercial options in terms of effectiveness, could make the trade-offs easier. A 2019 paper even demonstrated GDT with a Raspberry Pi 3 Model B+—a small computer that today costs about $40[110]—“with a short message service modem and a USB microphone attached,” which inspired a recent trial, with mixed results, in Phoenix.[111]
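The cited paper's method is not reproduced here, but a toy sketch can convey the simplest approach such a hobby-grade device might take: flagging loud, impulsive spikes relative to background noise. The thresholds and window sizes below are invented for illustration; real systems rely on trained classifiers and multi-sensor confirmation.

```python
# Toy sketch of an amplitude-based trigger a hobby-grade detector might use.
# This is NOT the method from the cited paper; thresholds and window sizes are made up.
import numpy as np

SAMPLE_RATE = 16_000   # samples per second (assumed)
WINDOW = 160           # 10-millisecond analysis windows
SPIKE_RATIO = 8.0      # a window this many times louder than background counts as "impulsive"

def find_impulsive_events(samples: np.ndarray) -> list[float]:
    """Return timestamps (seconds) of loud, sudden peaks in a mono audio signal."""
    n_windows = len(samples) // WINDOW
    rms = np.array([
        np.sqrt(np.mean(samples[i * WINDOW:(i + 1) * WINDOW] ** 2))
        for i in range(n_windows)
    ])
    background = np.median(rms) + 1e-9  # crude estimate of ambient noise level
    events = np.flatnonzero(rms > SPIKE_RATIO * background)
    return [float(i * WINDOW / SAMPLE_RATE) for i in events]

# Example: two seconds of low noise with a brief, loud impulse at the one-second mark.
rng = np.random.default_rng(0)
audio = rng.normal(0.0, 0.01, 2 * SAMPLE_RATE)
audio[SAMPLE_RATE:SAMPLE_RATE + 80] += 0.9
print(find_impulsive_events(audio))   # roughly [1.0]
```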
Beyond the direct costs of the technology are opportunity costs associated with sending police to GDT alerts when they could be doing something else. There can also be additional direct costs if, for example, a department hires more staff to deal with the added workload[112] or incurs additional wear and tear on vehicles and equipment.[113] In New York, Lander estimated that, in a single month, police spent 426.9 hours responding to unfounded and unconfirmed alerts, even assuming that only one officer responded to each alert.[114] He called this a “significant waste of officer hours” with “fiscal consequences which the City can ill-afford.” But one might take the math one step further: spread across the NYPD’s 34,000 officers, that averages out to only about a minute per month, per officer. The analysis of Durham’s trial found that the opportunity cost “was not large in a proportional sense,” since Priority 2 calls, which is how the city treated ShotSpotter alerts, increased only about 2% citywide.[115]
The opportunity cost of responding to GDT alerts was likely greater in Chicago, which has just one-third of New York’s population[116] and fewer than 12,000 officers[117] but significantly more lethal violence.[118] One study[119] reported “approximately 70 ShotSpotter-related dispatches each day, equating to 75 hours of officer investigation time” in the city. These dispatches were treated as Priority 1, though the time spent works out to roughly 12 minutes per officer over the course of a 30-day month. The authors reported that Chicago districts saw, on average, 73 top-priority 911 dispatches and just three ShotSpotter alerts per day, though the former varied immensely (running as high as 223) while the latter doubled when considering only districts that actually had ShotSpotter installed.
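A quick check of the per-officer arithmetic in the two preceding paragraphs, using the figures as cited:

```python
# Quick check of the per-officer arithmetic above, using the cited figures.
# New York: Lander's estimated monthly hours spread across roughly 34,000 officers.
nyc_minutes_per_officer = 426.9 * 60 / 34_000
print(f"NYC: about {nyc_minutes_per_officer:.1f} minutes per officer per month")      # ~0.8

# Chicago: 75 hours of investigation time per day, over a 30-day month, across ~12,000 officers.
chicago_minutes_per_officer = 75 * 60 * 30 / 12_000
print(f"Chicago: about {chicago_minutes_per_officer:.1f} minutes per officer per month")  # ~11.3
```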
These numbers suggest that, even in Chicago, ShotSpotter alerts cost relatively little officer time and only moderately increased Priority 1 dispatches. Nonetheless, by analyzing how responses changed as ShotSpotter rolled out in some parts of the city but not others, the study found that ShotSpotter significantly increased response times to Priority 1 911 calls, particularly when few officers were available or ShotSpotter alerts were most frequent. Overall, when responding to these 911 calls in ShotSpotter-covered areas, “officers are dispatched to calls slower (22%), arrive on-scene later (13%), and the probability of arrest is decreased 9%,” the study found.
While SoundThinking encourages “a high-priority response with at least two officers,” especially when large numbers of rounds are fired,[120] prioritization ultimately lies within the discretion of police departments. The Durham evaluation suggested that ShotSpotter alerts with only one or two shots and not accompanied by a 911 call could be treated as Priority 3 instead of Priority 2 (which, in Durham, is used for 911 gunshot calls and entails a two-car response). Columbus treats ShotSpotter alerts as Priority 2 but upgrades them to Priority 1, used for the most pressing emergencies, when accompanied by a 911 call reporting that someone was shot.[121] However, if officers do not prioritize an alert until there has also been a 911 call, they are giving up the time that elapses between the two.
Notably, critics of contemporary policing often argue that police do not spend enough time addressing violent crime and solving homicides, as opposed to handling more routine calls or patrolling neighborhoods. In her book about America’s homicide clearance problem, Ghettoside, crime journalist Jill Leovy wrote that “our criminal justice system harasses people on small pretexts but is exposed as a coward before murder.”[122] In the far more radical Defund, Black Lives Matter activist Sandy Hudson pointed to the low share of officer time spent on violence as a reason to reduce policing in general.[123]
By one estimate, homicides inflict two-thirds of the total cost of crime in the U.S.—despite being far rarer than, say, burglaries.[124] By another estimate, police in several cities with available data spend only about 4% of their time addressing violent crime.[125] Redirecting police effort toward gun violence is not necessarily a bad decision, even if each new investigative breakthrough or prevented crime consumes considerable effort.
Conclusion
ShotSpotter delivers on its promise of getting police to shooting scenes faster and boosting evidence recovery, but it increases officers’ workload and delivers crime benefits that are difficult to measure and may vary from place to place.
Nonetheless, solving one homicide is valuable, and preventing a homicide—whether by deterring gunplay, incapacitating a repeat offender, or saving a victim from bleeding out—is more valuable still. Even modest improvements in these outcomes could make GDT a worthwhile investment in dense cities with gun-violence problems, so long as they have the funds for the sensors, the staff to investigate alerts, and, ideally, a strong infrastructure such as a Crime Gun Intelligence Center and a Real-Time Crime Center to make use of the new information. Departments that lack the staffing or infrastructure to put ShotSpotter to use should focus primarily on developing those capabilities, which are important in and of themselves.
About the Author
Robert VerBruggen is a fellow at the Manhattan Institute, where he provides policy research, writes for City Journal, and contributes to special projects and initiatives in the President’s office. Having held roles as deputy managing editor of National Review, managing editor of the American Conservative, editor at RealClearPolicy, and assistant book editor at the Washington Times, VerBruggen writes on a wide array of issues, including economic policy, public finance, health care, education, family policy, cancel culture, and public safety. VerBruggen was a Philips Foundation Journalism Fellow in 2009 and a 2005 winner of the Chicago Headline Club Peter Lisagor Award. He holds a BA in journalism and political science from Northwestern University.
Endnotes