Owners will have to agree to let Tesla monitor their driving behavior through the company's insurance calculator. Tesla issued a detailed guide specifying the criteria on which drivers would be graded. If their driving is deemed to be “good” over a seven-day period, Musk said on Twitter, “beta access will be granted.”
It’s the latest twist in a saga that has regulators, safety advocates and relatives of Tesla crash victims up in arms because of the potential for chaos as the technology is unleashed on real-world roads. Until now, roughly 2,000 beta testers have had access to the technology.
As recently as July, Musk said the technology was a “debatable” proposition, arguing that “we need to make Full Self-Driving work in order for it to be a compelling value proposition.”
And already, investigators are looking at its predecessor, dubbed Autopilot, which navigates vehicles from highway on-ramp to off-ramp and can park and summon cars, with a driver monitoring the software. The National Highway Traffic Safety Administration opened an investigation last month into about a dozen crashes involving parked emergency vehicles while Autopilot was engaged.
“Full Self-Driving” expands Autopilot’s capabilities to city streets and offers the ability to navigate the vehicle turn-by-turn, from point A to point B.
Tesla and NHTSA did not immediately respond to requests for comment. Tesla has repeatedly argued that Autopilot is safer than cars being driven manually when the modes are compared using Tesla data and information from NHTSA.
Tesla’s move to rapidly roll out the features to large numbers of users is drawing criticism from regulators and industry peers who say the company is taking a hasty approach to an issue that requires careful study and an emphasis on safety.
Despite its name, the new software does not qualify as “self-driving” under criteria set by the auto industry or safety regulators, and drivers should always pay attention while it is activated.
“I do think that their product is misleading and overall leads to further misuse and abuse,” said National Transportation Safety Board Chair Jennifer Homendy, before turning to Musk himself. “I’d just ask him to prioritize safety as much as he prioritizes innovation and new technologies … safety is just as important, if not more important, than the development of the technology itself.”
As for the evaluation period for drivers who want to sign up, Tesla posted its “safety score” system on its website shortly before the button’s release. It said drivers would be scored on a scale of 0 to 100, with most receiving 80 or above. Drivers will be assessed on five factors, it said: forward collision warnings per 1,000 miles, instances of hard braking, aggressive turning, unsafe following and forced disengagements of the Autopilot system. Tesla would then use a formula to calculate their score.
“These are combined to estimate the likelihood that your driving could result in a future collision,” Tesla wrote. It was not immediately clear what score would qualify as “good” — as characterized by Musk — in order to receive Full Self-Driving.
Musk had earlier said drivers who make frequent use of the company’s Autopilot software will be rated favorably. Owners will be able to track their progress in real time, he said, and will be guided on how they can satisfy the requirements.
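Tesla has not published the exact formula behind the safety score, but a system of the kind described above could work by converting each of the five factors into a penalty and subtracting the total from 100. The sketch below is purely illustrative; the weights and penalty scaling are invented for this example and are not Tesla's actual formula.

```python
def safety_score(
    fcw_per_1000_miles: float,      # forward collision warnings per 1,000 miles
    hard_braking_pct: float,        # share of braking events that are "hard" (0-1)
    aggressive_turning_pct: float,  # share of turns deemed aggressive (0-1)
    unsafe_following_pct: float,    # share of time spent following too closely (0-1)
    forced_disengagements: int,     # times Autopilot forcibly disengaged
) -> float:
    """Combine per-factor penalties into a single 0-100 score.

    The weights below are hypothetical, chosen only to show the shape
    of such a formula; Tesla's real coefficients are not public.
    """
    penalty = (
        1.5 * fcw_per_1000_miles
        + 40.0 * hard_braking_pct
        + 30.0 * aggressive_turning_pct
        + 30.0 * unsafe_following_pct
        + 5.0 * forced_disengagements
    )
    # Clamp the result to the 0-100 range described by Tesla.
    return max(0.0, min(100.0, 100.0 - penalty))

# A driver with no flagged events scores a perfect 100; a driver with a
# handful of warnings and one forced disengagement scores 88.5 here.
print(safety_score(0, 0, 0, 0, 0))              # 100.0
print(safety_score(2.0, 0.05, 0.02, 0.03, 1))   # 88.5
```

Under a scheme like this, most drivers clustering at 80 or above simply means typical driving accrues only small penalties, which is consistent with Tesla's statement that the factors estimate the likelihood of a future collision.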
Late last month, industry group Chamber of Progress took aim at Tesla’s marketing of the technology.
Tesla’s cars “aren’t actually fully self-driving,” wrote the group, which is supported by Apple, Alphabet-owned Waymo and General Motors-backed Cruise. “The underlying issue here is that in case after case, Tesla’s drivers take their eyes off the road because they believe they are in a self-driving car. They aren’t.”
Homendy, the NTSB chair, said Tesla has not shown an active interest in improving the safety of its products. She said that the board has made recommendations stemming from fatal crashes in Williston and Delray Beach, Fla., as well as in Mountain View, Calif., but that they have gone unanswered.
“Tesla has not responded to any of our requests,” she said. “From our standpoint they’ve ignored us — they haven’t responded to us.”
“And if those are not addressed and you’re making additional upgrades, that’s a problem,” she added.
Following an investigation into a 2018 crash that killed a driver when his vehicle slammed into a highway barrier, the safety board called on NHTSA to evaluate whether Tesla’s systems posed an unreasonable safety risk.
Homendy said NHTSA needs to take a more active role in the matter. The agency recently began requiring reporting on all crashes involving driver-assistance systems.
“It is incumbent on a federal regulator to take action and ensure public safety,” Homendy said. “I am happy that they’ve asked for crash information from all manufacturers and they’re taking an initial step with Tesla on asking for crash information on emergency vehicles. But they need to do more.”
On Twitter, a steady stream of videos from early beta tests has depicted the still-nascent Full Self-Driving system’s confusion at new obstacles. The system has been shown struggling with roundabouts and unprotected left turns, abruptly veering toward pedestrians, and crossing a double-yellow line into oncoming traffic.
In the latter case, the user wrote: “I want the best for Tesla, but going wide release is not the move, not right now at least.”
Others said they have suffered personally from Tesla’s rapid deployment of its software, and urged the company to reconsider.
Bernadette Saint Jean’s husband, Jean Louis, was killed in July on the Long Island Expressway when a Tesla believed to be using automated features struck him on the side of the road, a crash NHTSA is investigating.
“Tesla should not be expanding its Autopilot or Traffic-Aware Cruise Control Systems until they can tell me why my husband and all of those First Responders had to die and be injured,” said Saint Jean, of Queens, in a statement through her attorney, Joshua Brian Irwin.