Once the update arrives, Tesla vehicles will be able to drive themselves in a city the way they can perform highway cruising now, the company said. That means interpreting stop signs and traffic lights, making sharp turns, and navigating stop-and-go urban traffic and other obstacles — a far more difficult task than navigating long, relatively straight stretches of highways.
Although Tesla’s website has promised features arriving as soon as this year, including the ability to recognize and react to traffic lights and stop signs, and what it calls “Automatic driving on city streets,” the suite would still require a human driver behind the wheel.
As soon as next year, Tesla has said, the cars will be able to operate reliably on their own, even allowing the driver to fall asleep. This tiered approach is different from companies such as Waymo, whose sole aim is to launch autonomous vehicles that do not need a driver behind the wheel.
The electric-car maker said it will do that without light detection and ranging, or lidar, complex sensors that use laser lights to map the environment — technology most autonomous vehicle makers consider necessary. Even with lidar, many of those manufacturers have adopted a slow and deliberate approach to self-driving vehicles, with limited testing on public roads.
Tesla shows little sign of such caution, officials said. And because autonomous vehicles are largely self-regulated — guided by industry standards but with no clearly enforceable rules — no one can stop the automaker from moving ahead.
The Washington Post spoke with a dozen transportation officials and executives, including current and former safety regulators, auto industry executives, safety advocacy group leaders and autonomous-vehicle competitors. In interviews, they expressed worries that Tesla’s plan to unleash robo-cars on the road on an expedited timeline — likely without regulated vetting — could result in crashes, lawsuits and confusion. Plus, they said, Tesla’s promised “full self-driving” features fall short of industry standards for a true autonomous vehicle because humans will still need to be engaged at all times and ready to intervene in the beginning. Some of the people interviewed requested anonymity because of the sensitivity of the matter.
“That concern among the industry writ large is real and valid because what potentially happens is you’re going to see fatalities in the news attributed to Tesla vehicles and the response you’re going to get from certain policymakers — kind of a knee-jerk reaction,” said a former senior official with the National Highway Traffic Safety Administration, which oversees the motor vehicle industry, who spoke on the condition of anonymity so he could comment candidly about the industry view of the company’s claims. That, in turn, will affect “other manufacturers who were a lot more deliberate, a lot more careful.”
Tesla has said it already has better real-world data than the rest of the industry. The company’s artificial-intelligence program is being trained in real time by data collected from every Tesla already on the road. Every touch of the steering wheel helps inform the company’s software of how to react to various scenarios.
Tesla, which launched its first consumer vehicle just over a decade ago, was founded with the goal of bringing electric cars to the masses. It has outpaced most rivals for years, launching cars that have a range of up to 370 miles. Its Autopilot system keeps cars within their lanes, performs steering functions and can summon and park cars without the driver touching the steering wheel.
Tesla chief executive Elon Musk wants to morph that product into his “full self-driving” suite, through a combination of the hardware already in its cars and over-the-air software changes that would add increased capabilities for city driving.
The company has also said that it has a demonstrated track record of safety, registering just one crash for every 2.87 million miles in which drivers had Autopilot engaged in the first three months of the year. That compares with one crash for every 436,000 miles among vehicles overall. But Autopilot is intended for use on highways and freeways, relatively uncomplicated roads with long straightaways that see fewer crashes, so it is unclear how comparable those statistics are. Tesla has declined to release more-detailed data.
Tesla cars also would eventually connect to the “Tesla Network,” equipping them to give rides when their owners aren’t using them, similar to the ride-hailing services of Uber and Lyft.
“By the middle of next year, we’ll have over a million Tesla cars on the road with full self-driving hardware,” with the ability to find the vehicle owners, drive them to their destination and park the vehicle, Musk said at Tesla’s Autonomy Investor Day in April. It will be at “a reliability level that we would consider that no one needs to pay attention … meaning you could go to sleep.”
“The fleet wakes up with an over-the-air update,” Musk said. “That’s all it takes.”
Meanwhile, competitors are racing to build their own autonomous taxi fleets expected to transport people without drivers within a few years. Companies including Waymo, owned by Google parent Alphabet, as well as Lyft-backed Aptiv and GM Cruise are piloting autonomous vehicles in Arizona, Nevada and California — three states that have become testing grounds for self-driving cars.
Tesla is betting it can win the race with its software updates. Its approach represents a stark departure from the more conservative approaches by many companies testing self-driving cars. For instance, when Uber’s self-driving vehicle hit and killed a pedestrian, the company halted testing of its vehicles for months.
Tesla has raised eyebrows with its statements that autonomous driving can be achieved through a slimmed-down system that sheds all but the most critical equipment. Musk says he wants Tesla’s system to use a combination of cameras and radar sensors that triangulate a field of vision, similar to human eyesight, forgoing lidar. Tesla also forgoes an in-cabin camera that would monitor the driver for safety, relying instead on torque sensors in the steering wheel to detect whether the driver’s hands are on the wheel.
Tesla executives said at an April conference that the company uses its radar and cameras to understand depth around its cars and real-world road conditions. They also pointed to its Shadow Mode, which tests how self-driving technologies would perform without actually activating those features; the company says this lets it train and refine its networks without needing to conduct the same road testing as other companies.
“Lidar is lame,” Musk said in April. Rivals are “all going to dump lidar. That’s my prediction. Mark my words.”
Meanwhile, traditional auto-industry executives have preached caution.
Former Daimler AG chairman Dieter Zetsche, who was also head of Mercedes-Benz, warned that crashes could prompt a reaction with autonomous vehicles similar to what happened after the Boeing 737 Max air crashes.
“Even if autonomous cars are 10 times safer than those driven by humans, it takes one spectacular incident to make it much harder to win widespread acceptance,” he said at an April conference.
Those interviewed say Tesla’s speed to market has created risks, especially given existing perceptions about the capabilities of Autopilot. While the company has repeatedly warned consumers not to take their hands off the wheel and to ensure they are still alert — with visual and audible cues to verify drivers are paying attention — some still ignore the guidelines as they get more comfortable with the technology. The Internet is filled with videos of Tesla drivers acting recklessly, in extreme cases taking naps or otherwise driving with their hands off the wheel as they marvel at the system.
Musk last month retweeted one such video, in which a driver expressed amazement at a Tesla driving in Autopilot as he kept his hands in the air.
The Autopilot driver-assistance system has come under scrutiny amid findings that it was active during at least three fatal crashes in the United States. Consumer Reports recently said Tesla’s Navigate on Autopilot feature, which can direct the car from on-ramp to off-ramp making lane changes on its own, was “far less competent” than a human driver, with the publication’s vice president of advocacy adding that Tesla was “showing what not to do on the path toward self-driving cars.”
In addition, in a study of 2,000 drivers released in mid-June, the Insurance Institute for Highway Safety found roughly half of drivers thought it was safe to take their hands off the wheel when using Tesla’s Autopilot, contrary to company guidance that humans should stay engaged. Six percent of the respondents thought it would be safe to nap with the system engaged, the IIHS said, double the proportion for the other automakers’ systems. Those surveyed drove a variety of cars and may not have been familiar with Tesla specifics.
“That shows already drivers are overestimating the capabilities of current technology,” said Kelly Nantel, vice president of communications and advocacy at the National Safety Council. With a name like Autopilot, “naturally you’re going to assume that the vehicle has the technology to drive on its own, and it does not.”
In a statement, the company pushed back against the IIHS study, saying that it was not representative of Tesla owners and that the company issues clear guidance on the necessity for the driver to remain engaged while using Autopilot.
The National Highway Traffic Safety Administration does not require that companies deploying self-driving cars employ a particular hardware suite or register with the federal government, though road vehicles must meet established standards for components including safety belts and brakes. Instead, it collects voluntary safety assessments from companies looking to release autonomous vehicles. The companies are not obliged to report on their activities but may choose to do so as a way of establishing credibility. Fifteen companies, including General Motors, Ford, Apple and Uber, have released voluntary safety assessments, according to NHTSA’s website.
Tesla is not among them.
In California, companies testing autonomous vehicles are required to register with the state Department of Motor Vehicles and report how many miles were driven, as well as provide detailed dispatches on each vehicle’s “disengagement,” when a human had to intervene over the course of the ride. Tesla is among the 61 companies registered with the state for testing with a human backup driver, according to the DMV. Still, the company said it did not test any vehicles in autonomous mode, as defined by California law, on public roads in 2018.
By contrast, Waymo reported in 2018 that its fleet drove 1.2 million miles in California, with an industry-leading rate of one report of human intervention per 11,017 miles driven.
One former senior federal transportation safety official said NHTSA needs to take a more active role in overseeing self-driving technologies to ensure the technology’s promise is not squandered by a rogue actor taking advantage of a lax regulatory environment. The official worried that one company’s overstatement of the capabilities of automation would have unintended consequences for the public and implications for the autonomous-vehicle industry as a whole.
“If the regulators don’t do something about it, then the next place there’s going to be a challenge will be in the courts when somebody gets hurt,” the official said. “Those mistakes are going to make a skeptical public even more skeptical and are going to delay the implementation of a technology that could save up to 40,000 lives a year.”
In response to concerns the agency hadn’t taken an aggressive enough role in regulating self-driving technology, NHTSA said it had “broad authority” over safety-related defects through existing federal motor vehicle regulations. The agency pointed to its authority to force recalls in cases where vehicles pose unreasonable risks, for example. The agency said it would assess Tesla’s vehicles according to its normal protocol once the technology became available.
Tesla says its Autopilot system sets it apart from other industry players, lessening the need to perform the type of testing that competitors such as Waymo have conducted. Instead, the company “has a fleet of hundreds of thousands of customer-owned vehicles that test autonomous technology in ‘shadow-mode’ during their normal operation,” enabling it to improve through billions of miles of real-world driving, Tesla said in its annual disclosure to California regulators. Tesla added that its fleet has driven more than 1 billion miles on Autopilot and that the crash rate with the system activated is nearly half that during normal human driving — though the company has declined to release detailed data that can be independently verified.
On Tuesday, Musk again used the promise of autonomy to try to convince people of the value of a Tesla. He tweeted that if cars with a “full self-driving” hardware suite woke up and became autonomous, “any such Tesla should be worth $100k to $200k.” The mass-market Model 3 currently starts below $40,000. He attributed the higher valuation to increased use: self-driving robo-cars that could take to the street by themselves would give owners 60 hours of use a week, up from 12.
Clarification: This story has been updated to more clearly explain that Tesla is launching a “full self-driving” suite as soon as this year that still requires a driver’s attention. As soon as next year, Tesla says, it may deploy fully self-driving vehicles that can operate independently of a human driver.