Tesla’s cars will suddenly activate “full self-driving features” in August, the company’s chief executive Elon Musk tweeted on Sunday, three days after federal investigators said a Tesla SUV driving semi-autonomously had accelerated to more than 70 mph and smashed into a highway barrier.
A Tesla spokesperson on Monday said the cars would start offering only a limited number of as-yet-undisclosed features, not full autonomy itself. But safety experts worried that grand promises of full self-driving capability could lull drivers into a false sense of security about technologies that are still largely unproven on the road.
Consumer groups argue Tesla’s marketing — and even the name “Autopilot,” which calls to mind a jet flying itself — contributes to a dangerous misunderstanding among drivers, suggesting they can take their hands off the steering wheel.
“Tesla has a history of using consumers as guinea pigs,” said David Friedman, the director of cars and product policy at Consumers Union, the advocacy arm of Consumer Reports. Tesla’s “misleading” marketing, he said, has had the dangerous effect of “providing overconfidence and building you up to thinking it’s safer than it actually is.”
Musk tweeted that Tesla’s Version 9, a software update that will automatically install in Tesla cars, would “begin to enable full self-driving features” and entirely fix certain issues with “Autopilot,” the name for the company’s package of semi-driverless features such as advanced cruise control. Autopilot resources, he said, had so far “rightly focused entirely on safety.”
But consumer groups say Tesla’s marketing has often blurred the line between encouraging responsible driving and suggesting the cars can drive themselves. Directors of two consumer groups, the Center for Auto Safety and Consumer Watchdog, wrote a letter last month urging the Federal Trade Commission to investigate Tesla for “deceptive and unfair practices” in its advertising of what Autopilot can do.
One driver, 38-year-old Walter Huang, was killed in March when his Tesla Model X P100D sped up and then steered into a highway barrier while driving on Autopilot, according to a National Transportation Safety Board report released last week. Huang had set the car to drive at 75 mph, but his hands were detected on the steering wheel during just 34 seconds of the last minute before impact.
Tesla encourages drivers to pay attention while Autopilot is engaged, and its cars include features that nudge drivers into taking over if, for instance, they’re not holding the steering wheel. But the company has also heavily promoted its cars’ ability to fend for themselves: The Tesla website promises “Full Self-Driving Hardware on All Cars,” which it says offers “a safety level substantially greater than that of a human driver.”
Tesla’s cars are routinely featured in online videos showing drivers misusing Autopilot at high speed. One video recorded last week shows an official Tesla vehicle speeding along a California freeway while its driver looks down, seemingly inattentive or asleep. Tesla representatives said they are investigating the incident.
Missy Cummings, the director of Duke University's Humans and Autonomy Lab, said Musk’s self-driving statement was an “attempt to generate hype” and that his assertion of a focus on safety was “laughable.”
“It took the Joshua Brown death to make them take driver monitoring more seriously,” Cummings said, referring to a fatal crash in Florida in 2016, in which a Tesla driving on Autopilot slammed into a semitrailer truck. “And, as pointed out by NTSB, it still failed to prevent (Huang’s) death, and indeed likely caused it.”
Buyers of Tesla’s sedan or SUV, including the $140,000 Model X P100D, can pay an extra $5,000 for “Enhanced Autopilot,” a package of still-experimental features that the company says could include “on-ramp to off-ramp” autonomous freeway driving. Drivers can prepay another $3,000 on top of that for its “Full Self-Driving Capability” package, which the company advertises with the promise: “All you will need to do is get in and tell your car where to go.”
But Tesla has shared little about how it has tested these features, Friedman said, adding that treating self-driving capabilities as simple software updates could have deadly results.
“We can’t treat cars like we do phones or computers,” he said. “There’s so much more at stake when you've got two to four tons of metal, glass and plastic hurtling down the road at 70 mph.”