The Washington Post
Democracy Dies in Darkness

Tesla’s recent Full Self-Driving update made cars go haywire. It may be the excuse regulators needed.

The National Highway Traffic Safety Administration previously asked Tesla for increased transparency around updates.

The Tesla logo on a 2020 Model X at a Tesla dealership in Littleton, Colo., in February. Tesla sent out and quickly rolled back a faulty Full Self-Driving update late last month, catching the eye of regulators. (David Zalubowski/AP)

SAN FRANCISCO — For a few hours late last month, Tesla cars began behaving erratically after receiving an overnight software update. Cars suddenly started slamming on the brakes at highway speeds, owners reported, risking collisions. CEO Elon Musk took to Twitter to acknowledge a problem with the software and vow that the update was being rolled back.

Ordinarily, that would have been the end of it for Musk, whose company has often flouted standard regulatory practices. But late last month, Tesla unexpectedly reported the glitch to the National Highway Traffic Safety Administration and issued an official recall notice detailing the problem, which may have affected nearly 12,000 vehicles.

Tesla’s decision comes as the Biden administration has stepped up enforcement of federal safety regulations regarding advanced driver-assistance systems — particularly Tesla’s habit of issuing software fixes without reporting underlying problems. Last month, NHTSA publicly dinged Tesla for failing to issue a formal recall when it issued a separate software update that enabled its cars to better detect parked emergency vehicles in low light. The update followed around a dozen prior crashes involving parked emergency vehicles while Autopilot was activated.

It wasn’t an isolated instance of criticism for the electric carmaker.


The National Transportation Safety Board, which has investigated multiple crashes involving Tesla’s Autopilot software, has publicly called out the automaker over its failure to follow up on its safety recommendations. NHTSA, the top federal auto safety regulator, is investigating the Autopilot software itself — bringing potential regulatory authority to the equation. Musk has taken aim at federal regulators, but they haven’t budged. Transportation Secretary Pete Buttigieg said recently the CEO was free to take up his concerns directly with him.

Longtime auto industry observers and safety experts say Tesla is getting the message, even if Musk continues to be combative. The repeated incidents have potential financial and legal ramifications for Tesla, they said.

“Inside Tesla, there has been a shift,” said John Rosevear, senior auto analyst at the Motley Fool. Employees are concerned that “ ‘We’re exposing ourselves here and we need to maybe get more serious about this,’ ” he added.


Tesla, which has disbanded its public relations department, did not respond to a request for comment. The company has argued in the past that using Autopilot is safer than normal driving, based on comparisons of crash data. Musk has called Autopilot “unequivocally safer.” Upon confirming the Full Self-Driving rollback, he said occasional issues are “to be expected with beta software,” which is intended to be tested in a variety of conditions to iron out problems.

Company officials pledged to work with NHTSA in a recent earnings call, saying Tesla embraced the scrutiny on its software.

The official recall notice marks a sharp departure from Tesla’s typical mode of operation, in which it acts like its Silicon Valley neighbors, pushing out update after update to the software that powers its products and making fixes in real time.

But Tesla vehicles are also being beta tested in real time on the road — and the latest updates highlight the heightened dangers that come with putting software that is still a work in progress in the hands of drivers.

Full Self-Driving is the latest iteration of the company’s software, now in the hands of roughly 12,000 drivers who paid as much as $10,000 to upgrade and received early access or passed a safety screening. It adds capabilities to navigate city and residential streets, with an attentive owner behind the wheel at all times. The features are not autonomous by regulatory and industry standards.

But already drivers have been reporting issues. Videos uploaded to social media show the software struggling to navigate roundabouts, veering toward pedestrians and even abruptly turning toward oncoming traffic.

Even Musk acknowledged in July that the software — which began its rollout to users a year ago — was a “debatable” proposition for potential subscribers.

The company’s less advanced version of the software, dubbed Autopilot, is standard on Tesla vehicles. The software can navigate highways from on-ramps to off-ramps, and can also steer within marked lanes.


Things started escalating when NHTSA announced over the summer it would begin requiring Tesla and other manufacturers to report on incidents involving advanced driver-assistance systems, such as Autopilot. And in August, the agency launched a formal probe of Autopilot after nearly a dozen crashes involving parked emergency vehicles.

One of Tesla’s latest run-ins with NHTSA came in October, after the company failed to notify officials of the September software update addressing emergency vehicle detection. Tesla had issued that update shortly after the government opened its probe into collisions with emergency vehicles.

NHTSA notified Tesla that such an action would typically be initiated through the federally established recall process, intended to remedy urgent safety risks through a combination of manufacturer expertise and government oversight.

“Any manufacturer issuing an over-the-air update that mitigates a defect that poses an unreasonable risk to motor vehicle safety is required to timely file an accompanying recall notice to NHTSA,” an agency official wrote in the Oct. 12 letter.


The agency also issued a letter in October raising concerns about another practice: requiring Full Self-Driving beta testers to sign a nondisclosure agreement prohibiting them from sharing certain information about the software beta. The agency noted it relies on the feedback from the public to learn of potential safety issues.

Tesla did away with the agreement, according to Musk, who compared it to toilet paper.

Kevin Smith, of Murfreesboro, Tenn., drives a Tesla Model Y and is part of the beta test. On Oct. 24, he hoped to test out the latest update but instead was locked out of the system, he said. And as he tried to get it to work, he heard from a fellow beta tester.

“He was screaming ‘Do not use it! Do not use it!’ ” Smith said. “ ‘We are trying to wake up the folks at Tesla, trying to get the word to Tesla.’ ”


One of Smith’s fellow testers had shared a video showing one of the emergency braking events — including “a pretty dramatic slamming of the brakes,” he recalled. “For that to trigger undesirably at high speeds is an incredibly dangerous event.”

Smith noticed later that day that Tesla had also remotely disabled his automatic emergency braking and forward collision warning functions, safety features he would ordinarily keep activated. And the company hadn’t let him know.

“Dear @elonmusk, are you in there crossing the streams? I didn’t change this brah,” he wrote in a tweet. “This isn’t ok without any communication. Communication is not hard. I’m doing it now. Please advise.”

By the following day, NHTSA had asked Tesla for more information about the incident, according to NHTSA spokeswoman Lucia Sanchez.

Auto safety experts say Tesla’s tweaks to safety features — without any notice to owners — were an unprecedented violation of trust. And it was exactly the type of behavior that had triggered the attention of federal auto safety regulators in the past.


Carnegie Mellon University professor Philip Koopman, who focuses on autonomous vehicle safety, described it as “incredible — not in a good way.”

“If you’re testing, you need to know if they’re changing your vehicle out from under you,” he said. “Just taking that away and not making it super super obvious to drivers that that’s happened is extremely concerning.”

Tesla submitted its recall notice on Oct. 29. In the bulletin, the company explained why corrective action was necessary; for many drivers, it was the first time they learned what had actually happened. The document confirmed what Tesla called the “false-positive braking” and accompanying warning chimes experienced by drivers, and detailed the sequence of events that resulted from the buggy software update.

“Tesla began to receive reports of false [forward collision warning] and [automatic emergency braking] events from customers,” it wrote. “In a matter of hours, we investigated the reports and took actions to mitigate any potential safety risk. This included cancelling [the update] on vehicles that had not installed it, disabling FCW and AEB on affected vehicles, and/or reverting software to the nearest available version.”

Tesla also laid out the risk to owners. “If the AEB system unexpectedly activates while driving, the risk of a rear-end collision from a following vehicle may increase,” it wrote.


Safety experts say it appears Tesla issued the recall because of the mounting regulatory pressure. NHTSA’s Sanchez declined to say whether the recall came at federal safety regulators’ urging. “NHTSA will continue its conversations with Tesla to ensure that any safety defect is promptly acknowledged and addressed,” she said.

Publicly, Musk has pushed back against the increased regulatory attention, taking aim on Twitter at the Biden administration and at federal auto safety appointees from both the NTSB and NHTSA.

He lashed out last month after NHTSA appointed Duke University professor Missy Cummings, who has been critical of Tesla’s Autopilot and autonomous ambitions, as senior safety adviser.

In a tweet, he called her track record “biased.” Tesla-supporting Twitter users swarmed her account, attacking her record. Buttigieg defended the appointment and invited Musk to raise his concerns with him directly, according to news reports.


NHTSA declined to make Cummings available for an interview.

NTSB chair Jennifer Homendy recently spoke with The Post and other publications, publicly scolding Tesla for rolling out new features without addressing prior recommendations about Autopilot. Those included instituting better driver monitoring, and implementing safeguards to make sure it is used in the conditions for which it is intended.

Musk tweeted her Wikipedia page soon after. His followers also went after her.


Over the weekend, Musk reacted to a different type of government attention: a U.S. Senate proposal to institute a billionaires’ tax that would target unrealized gains of the richest Americans. He created a poll on Twitter, in which 58 percent of the more than 3 million responses indicated he should sell 10 percent of his shares in Tesla.

“Much is made lately of unrealized gains being a means of tax avoidance, so I propose selling 10% of my Tesla stock,” he wrote, adding in a later tweet: “I will abide by the results of this poll, whichever way it goes.”

Musk sold about $5 billion worth of stock this week — some of it in a planned move related to his stock options — throwing the purpose of the poll partially into question.
