If your robot car crashes, who pays the bill?
Wired’s article on self-driving cars suggests that the remaining hurdles are at least as much legal as they are technological:
Beyond bureaucracy, there are deeper legal questions. Ryan Calo, director for privacy and robotics at Stanford Law School’s Center for Internet and Society, which is studying the legal framework for quasi-autonomous vehicles, notes how active the liability landscape already is when it comes to cars’ safety features. “People sue over all kinds of stuff. People sue because some feature that was supposed to protect them didn’t. People sue because their car didn’t have a blind-spot warning when other cars at the same price point did.” Imagine the complexity we’ll have when cars drive themselves. Who will be responsible for their operation — the car companies or the drivers? What happens, for example, when a highway patrol officer pulls over a self-driving car? Who gets the ticket?
As a RAND report observed, even as automakers create more semiautonomous technologies, they “will want to preserve the social norm that crashes are primarily the moral and legal responsibility of the driver, both to minimize their own liability and to ensure safety.” Consider what happened to the remote-parking assistant BMW developed a few years ago for getting into narrow spots. “You push a button and the car goes in and parks itself” while the driver waits outside, says Donald Norman, author of The Design of Future Things. When he asked BMW executives why he didn’t see it on the market, Norman says he was told, “The legal team wouldn’t let them go forward.”
Meanwhile, James Fallows is still waiting for flying cars.