On Friday, Tesla tried to explain the March 23 crash that killed Walter Huang, an Apple engineer whose electric SUV was in Autopilot mode when it crashed into a median on Highway 101 in Mountain View, Calif. In about 560 words, the company sought to counter that alarming photo, using statistics and figures to argue that an artificially intelligent driver is still safer than a human one.
Still, Tesla had to acknowledge two realities made clear by Huang’s death: Autonomous vehicle technology is still in its infancy, and, because no tech is perfect, people in even the most advanced cars will still be involved in fatal crashes.
“In the past, when we have brought up statistical safety points, we have been criticized for doing so, implying that we lack empathy for the tragedy that just occurred,” the company’s statement said. “Nothing could be further from the truth. We care deeply for and feel indebted to those who chose to put their trust in us. However, we must also care about people now and in the future whose lives may be saved if they know that Autopilot improves safety.”
The company said it was “incredibly sorry” for the loss Huang’s family suffered. A friend described Huang, 38, as “just a straight-up, caring guy.” He was also an early Tesla adopter.
Huang’s family members told San Francisco ABC affiliate KGO-TV that Huang had complained to his Tesla dealership that his SUV would swerve toward the same median where he was later killed. It was unclear Saturday whether the company had identified the issue before the crash, or what — if anything — it had done to address it.
In its statement, Tesla said several factors contributed to the crash. The highway crash attenuator that Huang's SUV struck, a safety barrier designed to absorb much of the force of a high-speed collision, had been crushed in a previous wreck and could not disperse the force of the Tesla's impact.
Huang also shared some of the blame, Tesla said. Even in Autopilot, Tesla’s vehicles are only semiautonomous, the company said. The driver is expected to remain alert and ready to take over if something comes up that the vehicle cannot handle. Huang apparently wasn’t paying enough attention, Tesla said.
“The driver had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds before the collision,” the company said. He had “about five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.”
But the biggest question — why Autopilot steered into the barrier in the first place — remains unanswered. The National Transportation Safety Board, the California Highway Patrol and Tesla are all investigating.
What is clear is that the crash, and the safety questions it raises, have contributed to a tough week for Tesla and others who want to make self-driving cars ubiquitous on American roads.
The automaker also has been stymied by the logistical demands of putting new cars on the road at the ambitious pace set by its chief executive, Elon Musk. In May 2016, Musk said Tesla would "aim to produce 100,000 to 200,000 Model 3s in the second half" of 2017, The Washington Post wrote. Model 3 sales in that period totaled 1,770.
Last August, Musk said the company would be producing 5,000 units a week by the end of the year. But by October, Musk said that ramping up production was "manufacturing hell," and the company's forecast became 5,000 Model 3 vehicles a week by the end of June.
“Musk is struggling with the … prosaic mission of assembling a passenger car here on Earth,” The Post’s Steven Mufson wrote, comparing Musk’s ambitions for space exploration to his earthly ones. “And in explaining a series of production misses over the past two years, some analysts say Musk has undermined his own credibility by repeatedly overpromising.”
Still, after Huang's crash, the company that built his car continues to make strong claims. Autopilot reduces crash rates by as much as 40 percent, Tesla's statement said, and a person driving a Tesla equipped with Autopilot hardware is 3.7 times less likely to be involved in a fatal accident than the average driver.
“Tesla Autopilot does not prevent all accidents — such a standard would be impossible — but it makes them much less likely to occur,” the statement said. “It unequivocally makes the world safer for the vehicle occupants, pedestrians and cyclists.”
Correction: An earlier version of this story suggested that Tesla and its chief executive Elon Musk encouraged drivers to take their hands off the steering wheel. Tesla says its Autopilot technology is not meant to relieve drivers of responsibility for being alert or in control, or for having their hands on the wheel.