
When driverless cars crash, who’s to blame?

(Photo: Steve Jurvetson, http://www.flickr.com/photos/jurvetson/5499949739)

Self-driving cars have an undeniable allure: Think of all the fun things you could do if you didn't have to keep your eyes on the road! A lot of Americans feel this way; about half of us say we'd try a driverless car if we could. But behind the obvious technological challenges of getting these vehicles on the road lies an array of scary-sounding legal questions we're going to have to grapple with if we're ever to zip around with our hands off the steering wheel.

The safety and liability implications of automation have come up in other contexts before, mainly when it comes to theorizing about armed military drones. What happens when a drone accidentally shoots a civilian? Is the person responsible the designer of the machine, the person who programmed its software, its commander or some other individual?

Those questions don't get any less complicated in a peacetime environment. In a recent paper, Brookings Institution scholar John Villasenor explains that automation isn't even a binary thing: The government already treats some cars on the road as partially autonomous, including vehicles that come with technology to counteract lane drift or to maintain a set distance from the vehicle ahead. The National Highway Traffic Safety Administration has a system that categorizes cars by how much automation they have — from Level 0, "no automation," to Level 4, "full self-driving automation."
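To make that scale concrete, here is a minimal sketch — not drawn from the article or Villasenor's paper — of how the five-level scheme might be represented in software. Only the names of levels 0 and 4 appear in the article; the intermediate labels follow NHTSA's preliminary policy statement, and the toy classify function is purely illustrative.

```python
from enum import IntEnum

class NHTSALevel(IntEnum):
    # Only levels 0 and 4 are named in the article; the intermediate labels
    # follow NHTSA's preliminary policy statement.
    NO_AUTOMATION = 0          # driver controls everything
    FUNCTION_SPECIFIC = 1      # one automated function, e.g. adaptive cruise control
    COMBINED_FUNCTION = 2      # at least two functions working together
    LIMITED_SELF_DRIVING = 3   # car drives itself, but the driver must be ready to take over
    FULL_SELF_DRIVING = 4      # car handles the entire trip

def classify(lane_keeping: bool, adaptive_cruise: bool) -> NHTSALevel:
    """Toy classifier using only the two features the article mentions."""
    automated_functions = sum([lane_keeping, adaptive_cruise])
    if automated_functions == 0:
        return NHTSALevel.NO_AUTOMATION
    if automated_functions == 1:
        return NHTSALevel.FUNCTION_SPECIFIC
    return NHTSALevel.COMBINED_FUNCTION

print(classify(lane_keeping=True, adaptive_cruise=True).name)  # COMBINED_FUNCTION
```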

It's hard enough to assign responsibility for a collision involving two human drivers, let alone a crash involving a driverless car, let alone a car that's only autonomous some of the time. If a driver begins drifting out of his lane but the computer assist fails to kick in, who's to blame?

Villasenor acknowledges these are tricky questions, but ones our legal system is mostly equipped to deal with. Whether it's classifying a slip-up as a design defect, a breach of warranty, negligence or some other mistake — all of which have concrete definitions under the law — we've at least got a vocabulary for talking about errors when they happen.

The trickier challenge is for the manufacturers to decide how to teach autonomous vehicles to make "good" decisions. Does a driverless car have an ironclad obligation to protect its occupants, even to the point of putting non-passengers at risk? Framed that way, it sounds like a no-brainer. But when the non-passengers happen to be cyclists or pedestrians, it gets a lot more complicated.

The future also promises to turn car manufacturers into software companies, as automated systems will need periodic upgrades and programming changes.

Maybe driverless cars will also come plastered with all sorts of warnings about what could happen if you're not paying attention.

"Manufacturers tend to err on the side of being very conservative in issuing such warnings," writes Villasenor. "As manufacturers introduce new forms of vehicle automation, they will no doubt include copious warnings about the attendant risks."

Or driverless cars might actually suggest when a human should take over for certain stretches of road known to be dangerous — something that could be implemented fairly easily, given all the data we already collect about car crashes. That way, the car manufacturer wouldn't be liable if the driver gets into a fender-bender.
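As a rough illustration of that idea, here is a sketch of how such a handover prompt might work: compare the segments on the car's route against historical crash rates and ask the driver to take over above some threshold. The segment IDs, crash figures and threshold below are all invented for the example; nothing here comes from the article.

```python
# Hypothetical sketch: prompt a human handover on road segments with high
# historical crash rates. The numbers below are made up for illustration;
# a real system would draw on government crash databases.

CRASHES_PER_MILLION_MILES = {       # invented figures keyed by road segment ID
    "I-95:exit-7-to-8": 0.4,
    "US-1:downtown": 2.9,
    "CR-12:mountain-pass": 4.1,
}

HANDOVER_THRESHOLD = 2.5            # arbitrary cutoff for "known to be dangerous"

def segments_needing_human(route: list[str]) -> list[str]:
    """Return the route segments where the car should ask the driver to take over."""
    return [
        seg for seg in route
        if CRASHES_PER_MILLION_MILES.get(seg, 0.0) >= HANDOVER_THRESHOLD
    ]

route = ["I-95:exit-7-to-8", "US-1:downtown", "CR-12:mountain-pass"]
for seg in segments_needing_human(route):
    print(f"Please take the wheel for {seg}.")
```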

The prospect of having to grapple with this legal gray area — not to mention the thought of putting your life in the hands of a machine — just might be enough to turn you off self-driving cars forever. But the big picture is also important. In light of how many people get injured or killed every year as a result of mistakes they or other humans make behind the wheel, self-driving cars could be a huge step up in terms of safety. If self-driving cars cause a handful of crashes every year but prevent thousands, it might be worth it.
