Walter Huang, 38, who was wearing a seat belt, was pulled from the blue Tesla by bystanders before the car was engulfed in flames. He died at a hospital.
Tesla has said it makes “extremely clear that Autopilot requires the driver to be alert and have hands on the wheel.”
The NTSB’s preliminary report on the crash adds to questions about the safety of Tesla’s Autopilot technology. It is not a driverless system but is often treated as one by drivers.
Tesla executives — who have been locked in a clash with the NTSB that has left them outside the federal investigation — did not immediately comment on the findings, instead pointing to a blog post from March that notes the safety benefits of the company’s technology.
In that earlier statement, the company said that “the driver had about five seconds and 150 meters of unobstructed view . . . but the vehicle logs show that no action was taken.”
Federal investigators did not provide a probable cause of the crash, and the inquiry is ongoing. But the NTSB said Huang was relying on Autopilot’s advanced driver-assistance features at the time of the accident: “traffic-aware cruise control,” which was set at 75 mph and is meant to keep the vehicle a safe distance from other cars, and “autosteer lane-keeping assistance,” which is intended to help guide the car safely down the highway.
Autopilot was used continuously for the 18 minutes and 55 seconds before the crash, investigators said. Toward the beginning of that period, the car “provided two visual alerts and one auditory alert” for Huang to put his hands on the wheel, according to the NTSB.
“These alerts were made more than 15 minutes prior to the crash,” according to the report. It is unclear why the alerts did not continue.
In the minute before impact, Huang’s hands were detected on the wheel three separate times, totaling 34 seconds. But they weren’t detected on the wheel during the last six seconds, the NTSB said.
Experts in human behavior and auto safety have warned about the dangers of drivers becoming overly reliant on semi-automated features. That has led some developers to focus on fully driverless technology, which requires nothing from passengers. Tesla has defended its iterative approach, arguing that its features, combined with attentive human drivers, already make cars significantly safer.
Driverless testing, which is not regulated by federal officials and is done widely on public roads, has also had problems. The NTSB found last month that one of ride-hailing giant Uber’s driverless cars misidentified a pedestrian before it hit and killed her in Tempe, Ariz., in March. Uber had deliberately disabled the emergency braking system on the vehicle, the NTSB found.
In the Tesla incident, in the southbound lanes of U.S. Highway 101 on March 23, Huang was traveling in the second lane from the left, a high-occupancy-vehicle lane. Eight seconds before the crash, “the Tesla was following a lead vehicle and was traveling about 65 mph,” the speed limit, according to the NTSB.
Seven seconds before the crash, the Tesla, still following the car ahead, “began a left steering movement.” At that point, it entered a “triangular-shaped boundary” between the main travel lane and an exit lane.
Four seconds before the crash, the Tesla “was no longer following a lead vehicle.”
A second later, the car sped up to almost 71 mph, then smashed into a safety device called an attenuator, which is supposed to act as a “smart cushion” reducing the severity of a high-speed impact. But the attenuator had been damaged 11 days earlier in a crash involving a Toyota Prius.
Tesla said earlier that “the reason this crash was so severe is because the crash attenuator, a highway safety barrier which is designed to reduce the impact into a concrete lane divider, had been crushed in a prior accident without being replaced. We have never seen this level of damage to a Model X in any other crash.”
In April, Tesla and the NTSB took the unusual step of parting ways, and the company is no longer a party to the investigation. The two sides differ on whether Tesla withdrew or was pushed from the inquiry. The NTSB had objected to Tesla making statements about the cause of the crash before the investigation was complete, while Tesla accused the agency of releasing incomplete information and “trying to prevent us from telling all the facts.”
In one of the disputed statements, Tesla said that “the crash happened on a clear day with several hundred feet of visibility ahead, which means that the only way for this accident to have occurred is if Mr. Huang was not paying attention to the road, despite the car providing multiple warnings to do so.”
The company said it empathizes with the family’s grief, “but the false impression that Autopilot is unsafe will cause harm to others on the road.”