“The National Transportation Safety Board (NTSB) appreciates the productive and professional cooperation extended by Tesla’s technical staff to our investigators over the course of our various crash and incident investigations, such as our recent work in [fatal crashes in] Spring, Texas, and Coral Gables, Florida,” she wrote. “I am deeply concerned, however, that Tesla’s action — or rather, inaction — to implement critical NTSB safety recommendations has not demonstrated the same productivity or professionalism.”
Homendy raised similar concerns in an interview with The Washington Post last month, saying the board had made and reiterated recommendations following fatal crashes in Williston and Delray Beach, Fla., as well as in Mountain View, Calif., but they had gone unanswered.
Teslas are equipped with a suite of driver-assistance features, called Autopilot, that can navigate the vehicles on highways and in parking scenarios, provided a driver is paying attention. The most recent evolution of the suite, termed “Full Self-Driving,” extends those capabilities to city and residential streets.
In the 2016 Williston crash, a speeding Tesla in Autopilot mode collided with a tractor trailer that had turned in front of it, killing the Tesla’s driver; the system did not register the trailer’s side against a brightly lit sky.
The Delray Beach crash also involved the use of Autopilot and a tractor trailer. In that 2019 crash, a tractor trailer traveling slower than traffic blocked the Tesla’s path on a highway after pulling out from a private driveway, federal investigators said. The car’s roof sheared off as it drove under the truck, the NTSB said.
And an Apple engineer was killed in 2018 when his Tesla in Autopilot mode slammed into a highway barrier; the driver had a game active on his iPhone at the time of the crash, investigators said, and they cited overreliance on the Autopilot system as well as Autopilot system limitations and shortcomings in driver monitoring.
Homendy said that after the NTSB issued its driver-monitoring recommendation to six manufacturers, every automaker except Tesla responded.
“The other five manufacturers responded to us, describing the actions they planned to take, or were taking, to better monitor a driver’s level of engagement,” she wrote in the letter. “Tesla is the only manufacturer that did not officially respond to us about the recommendation.”
She said the NTSB had issued its recommendations to Tesla on Sept. 28, 2017, following the Williston crash.
Tesla did not respond to a request for comment. The company has argued that Autopilot is safer than normal driving, when crash data is compared. “Autopilot is unequivocally safer,” Musk has said. But such data is not directly comparable because Autopilot is supposed to be limited to certain road types and conditions.
The National Highway Traffic Safety Administration, the top federal auto safety regulator, is investigating Autopilot’s role in about a dozen crashes involving parked emergency vehicles while the system was activated.
Meanwhile, Homendy said, Tesla was adding new functionality with the features it calls “Full Self-Driving” without addressing the board’s prior concerns.
“You have stated that ‘safety is always the primary design requirement for a Tesla,’” she wrote. “Now that statement is undercut by the announcement that Tesla drivers can request access to ‘Full Self-Driving Beta technology,’ operational on both highways and city streets, without first addressing the very design shortcomings that allowed the fatal Williston, Delray Beach, and Mountain View crashes to occur.”