“We’re starting to see police departments change what they do because of this new information and this new ability to sort of use this information in new ways.”
The “new information” is big data. The “new ways” in which this data is being used by law enforcement around the country are as varied as the police departments implementing them. Andrew Ferguson, the author of “The Rise of Big Data Policing: Surveillance, Race, and the Future of Law Enforcement,” says that as much as Google and Amazon collect data on their consumers, so do the police. “Different kinds of data [are] changing how police do their jobs, how they figure out what patrols they go on, who they target, and in many ways changing how they surveil citizens,” Ferguson told me in the latest episode of “Cape Up.” “It depends on which type of technology you’re talking about.”
We’re not just talking about the fixed cameras and the police body cameras that we’re accustomed to. Ferguson gets into the ways that big data is driving crime fighting. “There’s person-based predictive policing where they’re actually collecting information about people, about people they consider at risk, individuals they think will be most likely to commit an act of violence or be victims of violence,” Ferguson said. For fans of the 2002 Tom Cruise film “Minority Report,” this will sound like life catching up with art.
The Los Angeles Police Department is “partnered with Palantir, the same company that brought you the ability to track terrorists across the globe, and is now starting to track gang members and sort of criminal associates across Greater LA,” explained Ferguson, a law professor at the University of the District of Columbia. They are doing it by going “into the community to find the ‘chronic offenders,’ individuals that they’ve already labeled as the people they believe are the most at risk for violence or recidivism.” Once found, Ferguson said, the officer fills out a field-interview card, which has “information about who are they with, where are they, what kind of car are they driving, who’s their girlfriend at the time, what tattoos they have.”
A similar and controversial kind of tracking being used in Chicago is the “strategic subject list,” a.k.a. the heat list. “The way you currently get on the Chicago heat list is the following variables,” Ferguson explained. “If you’ve been arrested for a violent crime, a narcotics crime or a weapons crime; if you’ve been the victim of a violent crime or a shooting; your age at the last arrest — younger the age, higher the risk score — and kind of the trend line. Are you aging out, or is this sort of happening over and over again?” The algorithms that emerge from inputs that include arrests are problematic for the Chicago Police Department, which a 2017 Justice Department report found to be rife with “systemic endemic structural racism,” Ferguson pointed out.
“We are literally, in Chicago right now, we’re giving people, human beings, threat scores, one to 500-plus, so that when you get pulled over . . . there’s literally a score on the dashboard computer,” Ferguson said. “So that the police officer knows like, ‘oh, you’re a 482,’ or whatever it would be. And you can imagine how that number, that high threat score is gonna impact how that officer treats this human being, no matter where they’re going.”
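Ferguson’s description hints at the rough shape of such a scoring model. The sketch below is purely illustrative: the input variables come from his list, but every weight, the age cutoff, and the 500-point cap are invented here for demonstration; Chicago’s actual model is not public in this form.

```python
# Illustrative sketch of a "heat list"-style risk score, using only the
# variables Ferguson names. All weights and thresholds are invented for
# illustration; this is NOT the Chicago Police Department's actual model.

def risk_score(violent_arrests, narcotics_arrests, weapons_arrests,
               times_victimized, age_at_last_arrest, trend_rising):
    score = 0
    score += 40 * violent_arrests      # arrests for a violent crime
    score += 20 * narcotics_arrests    # narcotics arrests
    score += 30 * weapons_arrests      # weapons arrests
    score += 35 * times_victimized     # victim of a violent crime or shooting
    # "Younger the age, higher the risk score": age at last arrest,
    # with a hypothetical cutoff at 30
    score += max(0, 30 - age_at_last_arrest) * 3
    # The "trend line": is this recurring, or is the person aging out?
    if trend_rising:
        score += 50
    return min(score, 500)  # scores reportedly run "one to 500-plus"

# A young person with repeat arrests and one victimization scores high:
print(risk_score(violent_arrests=2, narcotics_arrests=1, weapons_arrests=0,
                 times_victimized=1, age_at_last_arrest=19,
                 trend_rising=True))  # prints 218
```

The point of the sketch is the design choice Ferguson criticizes: because arrests are an input, any bias in who gets arrested flows directly into the score.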
The LAPD is also using what Ferguson called “place-based policing,” a technique, he said, they pioneered with the help of a company called PredPol. “They basically took an algorithm that looked at whether a crime was essentially contagious. Certain kinds of crimes are almost viral in nature,” Ferguson explained. “Burglary, car theft, theft from auto. Usually there is an uptick in that particular crime in the near location after a crime.” The result is an algorithm that produces an action map handed to police officers at roll call.
“It’s a Google Map with little red boxes of the places where you are to patrol in your free time. Little 500-by-500-foot boxes where you are to go and hang out to deter and protect whatever it is at this moment when you have free time,” said Ferguson. But the implications of this information in the hands of cops are troubling. “So, in your mind as an officer, everyone in this box is probably a thief, right? Because the computer told you to be on the lookout at a particular time, particular location for a particular crime,” Ferguson pointed out. “Suddenly, you can just imagine how that distorts an officer’s vision of the community.”
Listen to the podcast to hear Ferguson discuss the good reasons why police departments have gravitated to predictive policing and the companies creating the algorithms that are at their foundation. He also talks about their problems, their impact on civil liberties and the need for their use to be subject to public review. And he says that all the data being collected could be put to even better use.
“If that individual is more at risk, maybe instead of sending a police officer to that door, you send an employer. You send a teacher and say, ‘Come back to school and let’s get back your education,’ ” Ferguson told me. “The idea of identifying risk of neighborhoods or individuals is fine, but the remedy could be community-wide. It can be social. It could be government-wide. We could send Social Service to the door. We don’t have to send police.”
Follow Jonathan on Twitter: @Capehartj
Subscribe to Cape Up, Jonathan Capehart’s weekly podcast