Then-Attorney General Jeff Sessions, among others, claimed that giving military equipment to police would improve policing. When he announced that the Trump administration would repeal the Obama-era limits, he said that “studies have shown this equipment reduces crime rates … assaults against police officers, and … complaints against police officers.”
His summary of the research was accurate, but the research is based on flawed data. Here’s what happened.
A government program gave military equipment to police
Sessions relied on findings from two studies published in the same issue of the American Economic Journal: Economic Policy. Economists Vincenzo Bove and Evelina Gavrilova wrote that “military aid reduces street-level” crime and that the “display of military equipment” causes criminals to rethink committing crimes. Meanwhile, economists Matthew Harris, Jinseong Park, Donald Bruce and Matthew Murray concluded that when a police department had tactical military equipment, it reduced both complaints and assaults against police officers and enabled them to make more drug-related arrests.
These headline conclusions were based on evidence from the 1033 Program, which allows the Defense Logistics Agency to send police departments weapons and equipment. One local 1033 Program officer described the program to us as “Uncle Sam’s Goodwill Store.” Local and state police departments request military surplus (of unknown quality) as it becomes available — including items as varied as assault rifles, armored vehicles, computers and tube socks. All they have to pay for is shipping. Data on the program looks like long lists of shipments from the federal government to local and state police.
This is the data the economists who wrote these studies relied on. They took the list of shipments, which described what was shipped and gave a date, then transformed it into a record of what police held over time. Their goal was to determine whether the presence of weapons and equipment reduced or increased crime.
Data from this program may have had real problems
Responsible scholars try to identify possible sources of bias in whatever data they’re examining. That is why, for example, the authors of these studies tried to determine whether police requests for equipment might be driven by crime. If we see a high-crime area with lots of equipment, it might be because police made requests in response to crime, or it might be that the equipment itself contributed to crime. The studies used standard techniques to correct for this possible bias.
However, the data was incomplete. The data recorded what agencies had on hand in 2014, but did not accurately report when they received it or what they had before. Weapons and gear can be returned, destroyed or obtained from another local police department or state coordinator — and the data did not record any of that. The researchers couldn’t know what police had in the past, so any conclusions they drew were based on an inaccurate history.
Since then, the government has released a new inventory every three months. We have carefully examined changes in inventory records over the past five years and learned that about 1 in 5 weapons and vehicles are returned or transferred within five years. That means the techniques the authors used can’t in fact establish what is causing what, because their data is probably missing a lot of weapons and equipment. A department’s decision to give back, transfer or destroy an assault rifle, for example, might be related to drops or spikes in crime. We don’t even have enough information to know which. Not surprisingly, a research team at Emory University with better data could not replicate the results of these studies.
So what happens now?
Studies of police militarization illustrate the difficulties that scholars can run into when they don’t investigate the limits of the data they are using. Journalists at the New York Times, NPR, Chicago Tribune and The Washington Post first acquired the data. Seven of the 14 published or working papers about the effects of the program between 2015 and 2019 directly cite these sources, more or less using the data as if it were an accurate record. To date, these studies have been cited 135 times and received coverage in both the Economist and The Post, which is significant for studies this recent.
It’s hard to figure out the actual impact of the 1033 Program, because the federal government did not retain inventory records of what police departments had before 2014. To know what kinds of federal military equipment state and local police departments had before then, you have to ask police departments themselves — and many keep limited records. We will probably never know exactly what military equipment police had over the previous 15-year period, which makes it hard to learn about its impact.
We know this because we are part of a research team trying to reconstruct the 1033 Program’s record in Michigan. After accounting for the data problems, we find that equipment returns ordered in 2015 by the Obama administration had no detectable impact on violent crime or officer safety. Jonathan Mummolo similarly finds that SWAT teams did not reduce crime or enhance officer safety.
Scholars and journalists have strong incentives to write quickly on “hot” topics such as police militarization. However, speed can be the enemy of careful data collection and curation. After five years of research, we are still a long way from understanding the impact of giving local police weapons and equipment designed for military use.
Ayse Eldes is an undergraduate at the University of Michigan.
Kenneth Lowande is assistant professor of political science and public policy (by courtesy) at the University of Michigan.