
Opinion | Coming soon to your local police department: Killer robots

Consultants with Flyspan Solutions demonstrate a drone intended for police use at the 2014 Drone Expo in Los Angeles. (Mark Ralston/AFP/Getty Images)

Imagine: You get pulled over for rolling through a stop sign, but instead of a police officer tapping your window, you see a drone hovering beside you. “Hold your license and registration up to the camera, and do not make any furtive movements,” says a voice.

“Furtive movements?” you think while eyeing the rectangular box on the drone’s underside, which you have heard contains an explosive charge large enough to kill you.

If you think that’s an inevitable future, it isn’t just because of movies like “RoboCop.” In the real world, police forces have enthusiastically embraced all manner of technological means of both surveillance and violence. These technologies are almost inevitably abused, and to believe police wouldn’t use such technology in troubling ways seems naive.

Which is why it’s so troubling that the San Francisco Board of Supervisors decided this week to authorize the city’s police to use robots to kill when they decide circumstances warrant it. There was one notable instance, in Dallas in 2016, when police strapped explosives to a robot and used it to kill a suspect. But now that San Francisco has an official policy on the technique, many more police departments are likely to develop their own policies — and to start obtaining the equipment necessary to kill people remotely.

Some important distinctions are in order. The phrase “killer robots” typically refers to systems that can make autonomous decisions to kill, usually on a battlefield. This possibility has been intensely debated in military and human rights circles, and the Pentagon’s position remains that no weapons system should be permitted to make decisions to kill without human approval.


Our military has become increasingly reliant on drones to kill from afar. But when it comes to policing, we still assume that deadly force is almost never planned and is used only when a situation has spun out of control.

That’s one of the real dangers here: The easier you make it for cops to kill, the more apt they might be to do so.

Part of the problem is that unlike military personnel, cops controlling killer drones won’t be doing it as their primary job. They’ll likely be ordinary officers who get some special training but bring their own biases and reactions to their occasional assignment to control a killer drone. In the United States, that means police officers will likely be trained in the “warrior mindset” philosophy, which instills fear that their lives could be snuffed out at any moment and encourages deadly force as a response to that fear.

One might hope they won’t feel that fear, and might be less likely to kill, if they’re sitting in front of a computer screen. If their own lives aren’t at risk, they might not shout “He’s reaching for his waistband!” and start firing.

But they won’t easily be able to discard everything they’ve been taught about how to think and react in stressful situations. As more police departments deploy drones not just for surveillance but to interact with suspects — or protesters, or people acting erratically — it may be all too easy to press that red button on the joystick and use their exciting new equipment as it was designed.

San Francisco officials stress that they aren’t putting guns on drones. They’re just talking about attaching explosives to them, for something like a hostage situation. An SFPD spokesperson says such robots will be reserved for “extreme circumstances to save or prevent further loss of innocent lives.”

But it’s easy to see how the definition of “extreme circumstances” could become ever more elastic. You start off deploying your exploding drone to take down a terrorist strike team, but before long, you’re using it to serve warrants.

We’re also likely to see military systems cross over into police use, especially as this technology is advancing so rapidly. Police departments large and small eagerly scooped up military equipment offered to them by the Defense Department starting in the 1990s, and research showed that departments that militarized this way became more violent toward the communities they were supposed to protect.

President Barack Obama pulled back on those equipment transfers, but robots and drones get cheaper all the time, which means local police will be able to afford them without help from the federal government. For instance, an Israeli company recently released a weapons system based on small, nimble racing drones, which can autonomously zip through urban areas and locate enemies, then kill them with onboard explosives once an operator gives the go-ahead.

How long before police departments start looking for something similar, and companies step up to exploit that market?

The argument for military killer robots has always been that despite its current limitations, artificial intelligence will eventually enable judgments — like distinguishing between combatants and civilians — that are as good as or better than those made by humans. And unlike humans, robots don’t get tired or afraid or angry.

But with policing, the interactions are more complex. There is no “enemy” to kill at will.

We need to start thinking hard about the potential for abuse and worst-case scenarios, so we can put in place firm safeguards and systems of accountability. Because your local police department might not have lethal drones today, but it probably will soon.
