The driver of a Tesla who was killed in a crash that drew worldwide attention last year was too reliant on the car’s “Autopilot” system when he plowed into the side of a tractor-trailer at more than 70 miles per hour, federal investigators concluded Tuesday.
The National Transportation Safety Board said Joshua Brown’s overreliance on the autopilot system “permitted his prolonged disengagement from the driving task and his use of automation in ways inconsistent with guidance and warnings from the manufacturer.” The Tesla’s “Autopilot” system functioned as designed in the May 7, 2016, crash. However, the system is meant to augment, not replace, the driver, the NTSB said.
“In this crash, Tesla’s system worked as designed, but it was designed to perform limited tasks,” NTSB Chairman Robert Sumwalt said. “The result was a collision that should not have happened. System safeguards were lacking.”
The board said the “operational design” of the vehicle’s autopilot encouraged Brown’s overreliance on it. “Drivers must always be prepared to take the wheel or apply the brakes,” Sumwalt said.
The NTSB findings came as a partial exoneration of Tesla and a relief for those working to put autonomous vehicles on the road.
Linking the Tesla crash to the coming generation of fully autonomous cars fueled public fears of such vehicles, surveys found.
"I think it's important to clear up a possible misconception," Sumwalt said. "The automobile involved in the collision was not a self-driving car."
In the aftermath of the crash, Tesla put more stringent limits on hands-off driving, disabling the autopilot feature if drivers repeatedly ignore the audible and dashboard warnings.
Among the NTSB recommendations Tuesday, the board said automakers should incorporate similar measures and restrict use on highways with cross traffic.
An NTSB investigator testified Tuesday that “collision mitigation systems” do not reliably detect cross traffic. The crash has been documented by at least three teams of investigators, including one from the NTSB, which issued a preliminary report in June.
Brown, 40, a former Navy SEAL, was driving down a four-lane highway near Williston, Fla., on a sunny Saturday afternoon with his Tesla Model S set in autopilot mode. The system allows the vehicle to guide itself — using multiple sensors linked to a computer system — like a greatly enhanced cruise control system, and comes with automatic emergency braking designed to avoid frontal collisions.
Two minutes before the crash, according to reports, Brown had set the speed at almost 10 miles per hour above the posted speed limit.
At about 4:40 p.m., a 53-foot tractor-trailer loaded with blueberries that had been traveling in the opposite direction turned left toward a side road, blocking the path of Brown’s Tesla.
The Tesla careened under the truck’s trailer, traveled almost 300 feet farther and snapped off a utility pole, spinning around into a front yard about 50 feet away.
The driver of the blueberry truck, Frank Baressi, 62, told the Associated Press that Brown was “playing Harry Potter on the TV screen.” The Florida Highway Patrol said a DVD player was found in the Tesla, but two of the NTSB investigators on Tuesday disputed that it was being used to watch a video.
“We are quite certain that was not the case,” the NTSB’s Ensar Becic told the board members.
In its preliminary report, the NTSB said Brown had his hands on the wheel for just 25 seconds in the final 37 minutes of his drive. The report said he had received six audible warnings and seven visual dashboard warnings from the autopilot system telling him to keep his hands on the steering wheel.
The National Highway Traffic Safety Administration joined the NTSB, the highway patrol and Tesla in investigating the crash. NHTSA determined Tesla’s autopilot feature was not at fault, and its investigators said Brown never tried to avoid the truck or apply the brakes before the crash.
Most of the headlines in the aftermath of the crash were accurate, but some confused the Tesla with a fully autonomous vehicle. A British science magazine ran the headline “Tesla driver dies in first fatal autonomous car crash in U.S.,” while CNN asked “Can we trust driverless cars?” and the headline on CBS was “This fatality could slam the brakes on driverless cars.”
Those were just the sort of headlines that automakers who plan to launch genuine driverless cars dread. The Tesla autopilot is a Level 2 system — partial automation that still requires constant driver supervision — while Level 5 is the standard for a fully autonomous car.
With a public already skeptical about fully autonomous cars, reaction to the initial mishaps may play a significant role in determining how quickly Americans get comfortable with the new cars.
Traditional automakers plan to gradually introduce features until the day arrives when they've produced a fully autonomous vehicle. But newcomers to the market, like Waymo, plan to put fully autonomous vehicles on the road from day one. Waymo, the name given to Google's self-driving car project when it was spun off as an independent brand, concluded a vehicle without a steering wheel or pedals was the way to go after discovering its own employees often got distracted when driving autonomous cars equipped with steering wheels.
Anticipating the attention paid to Tuesday’s NTSB hearing, Tesla issued a statement saying NHTSA has found Autopilot can reduce crashes by 40 percent.
“We appreciate the NTSB’s analysis of last year’s tragic accident and we will evaluate their recommendations as we continue to evolve our technology,” the Tesla statement said.
“We will also continue to be extremely clear with current and potential customers that Autopilot is not a fully self-driving technology and drivers need to remain attentive at all times.”
Brown's family issued a statement Monday through its lawyer.
"We heard numerous times that the car killed our son. That is simply not the case," the family statement said. "There was a small window of time when neither Joshua nor the Tesla features noticed the truck making the left-hand turn in front of the car. "