New details about the collision last weekend involving one of Uber’s self-driving vehicles raise questions about how self-driving technology, which is still under development, will respond to roadway scenarios where human behavior and common driving practices may not always align with the letter of the law.
“As far as I could tell, the third lane had no one coming in it so I was clear to make my turn,” Cole wrote in her testimony. “Right as I got to the middle lane about to cross the third, I saw a car flying through the intersection but couldn’t brake fast enough to completely avoid collision.”
That car was an Uber SUV that employees Patrick Murphy and Matthew Rentz were operating in self-driving mode. The car was traveling at an estimated 38 mph, two miles per hour under the posted speed limit, Murphy wrote.
“The traffic signal turned yellow as I entered the intersection,” Murphy wrote. “As I entered the intersection, I saw the vehicle turn in left. … There was no time to react as there was a blind spot created by the line of traffic in the southbound left lane.”
Cole struck the Uber SUV, which then hit a traffic signal pole. The Uber SUV flipped on its side and collided with two other cars before coming to a stop.
No serious injuries were reported, although some drivers complained of soreness and whiplash.
It appears that Cole is to blame for failing to yield to oncoming traffic, but the collision unfolded in a way with which most motorists can sympathize. Drivers often slip through lanes of traffic when other cars are at a standstill or pick up their speed to make a light before it switches. How machines respond to those behaviors, and whether they also engage in them, is something engineers still have to sort out.
Uber temporarily halted its self-driving fleet over the weekend but returned the vehicles to the road on Monday. A spokeswoman said Wednesday that its self-driving vehicles will cross intersections at a yellow light if the vehicle has enough time to do so at its current speed. The car will stop at a yellow light if it can do so comfortably or does not have enough time to cross, she said.
The spokeswoman added that vehicle operators have the authority to take control of self-driving vehicles in unsafe or dangerous situations, when the field of view is compromised or if they are not comfortable proceeding through a yellow light.
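The yellow-light policy the spokeswoman describes amounts to a three-way decision: proceed if the car can clear the intersection at its current speed, stop if it can brake comfortably, and defer to the human operator otherwise. A minimal illustrative sketch of that logic, in Python, might look like the following. This is not Uber's actual software; the function name, parameters, and the assumed comfortable deceleration of 3 m/s² are hypothetical, chosen only to make the described rule concrete.

```python
def yellow_light_decision(speed, dist_to_stop_line, intersection_length,
                          yellow_remaining, comfort_decel=3.0):
    """Illustrative sketch of the yellow-light rule described in the article.

    speed               -- current speed in m/s (38 mph is about 17 m/s)
    dist_to_stop_line   -- meters from the car to the stop line
    intersection_length -- meters needed to clear the far side
    yellow_remaining    -- seconds of yellow left
    comfort_decel       -- assumed comfortable braking rate in m/s^2
    """
    # Proceed if the car can fully clear the intersection at its
    # current speed before the light changes.
    time_to_clear = (dist_to_stop_line + intersection_length) / speed
    if time_to_clear <= yellow_remaining:
        return "proceed"
    # Otherwise stop if a comfortable deceleration brings the car
    # to rest before the stop line: stopping distance = v^2 / (2a).
    stopping_distance = speed ** 2 / (2 * comfort_decel)
    if stopping_distance <= dist_to_stop_line:
        return "stop"
    # Neither option works cleanly -- the article notes that human
    # operators may take control in situations like this.
    return "operator_judgment"
```

The third branch corresponds to what traffic engineers call the "dilemma zone," where a vehicle can neither clear the light in time nor stop comfortably; per the spokeswoman, that is one of the situations in which Uber's vehicle operators are authorized to take over.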
As with many roadway accidents, blame may be in the eye of the beholder. While Cole failed to yield, the Uber SUV appeared to hustle through the intersection to avoid getting stuck at a red light, according to a witness named Brayan Torres.
“It was the other driver’s fault for trying to beat the light and hitting the gas so hard,” Torres wrote in his testimony for the police report.
Last year, a Tesla Model S operating in autopilot mode was involved in a fatal car accident in Florida. That collision garnered national headlines and sparked an investigation by federal regulators. The National Highway Traffic Safety Administration found no defects with Tesla’s software.
Read more from The Washington Post’s Innovations section.