SAN FRANCISCO — Tesla vehicles running its Autopilot software have been involved in 273 reported crashes over roughly the past year, according to regulators, far more than previously known and providing concrete evidence regarding the real-world performance of its futuristic features.
Previously, NHTSA said it had probed 42 crashes potentially involving driver assistance, 35 of which included Tesla vehicles, in a more limited data set that stretched back to 2016.
Of the six fatalities listed in the data set published Wednesday, five were tied to Tesla vehicles — including a July 2021 crash involving a pedestrian in Flushing, Queens, and a fatal crash in March in Castro Valley, Calif. Some dated as far back as 2019.
Tesla Autopilot is a suite of systems that allows drivers to cede physical control of their electric vehicles, though they must pay attention at all times. The cars can maintain speed and safe distance behind other cars, stay within their lane lines and make lane changes on highways. An expanded set of features, called the “Full Self-Driving” beta, adds the ability to maneuver city and residential streets, halting at stop signs and traffic lights, and making turns while navigating vehicles from point to point.
But some transportation safety experts have raised concerns about the technology’s safety, since it is being tested and trained on public roads with other drivers. Federal officials have targeted Tesla in recent months with an increasing number of investigations, recalls and even public admonishments directed at the company.
The new data set stems from a federal order last summer requiring automakers to report crashes involving driver assistance to assess whether the technology presented safety risks. Tesla's vehicles have been found to shut off the advanced driver-assistance system, Autopilot, around one second before impact, according to the regulators.
The NHTSA order required manufacturers to disclose crashes where the software was in use within 30 seconds of the crash, in part to mitigate the concern that manufacturers would hide crashes by claiming the software wasn’t in use at the time of the impact.
“These technologies hold great promise to improve safety, but we need to understand how these vehicles are performing in real-world situations,” NHTSA’s administrator, Steven Cliff, said in a call with media about the full data set from manufacturers.
Tesla did not immediately respond to a request for comment. It has said that Autopilot is safer than normal driving when crash data is compared. The company has also pointed to the vast number of traffic crash deaths on U.S. roadways annually, estimated by NHTSA at 42,915 in 2021, hailing the promise of technologies like Autopilot to “reduce the frequency and severity of traffic crashes and save thousands of lives each year.”
Crash data from normal driving and from Autopilot are not directly comparable, because Autopilot operates largely on highways. Tesla CEO Elon Musk, however, has described Autopilot as “unequivocally safer.”
Musk said as recently as January that there had been no crashes or injuries involving the Full Self-Driving beta software, which has been rolled out to a more limited number of drivers for testing. NHTSA officials said their data was not expected to specify whether Full Self-Driving was active at the time of the crash.
The reports present a new window into systems like Autopilot, but the database remains a work in progress — with many unknowns even in the raw data and questions left outstanding. The data does not lend itself easily to comparisons between manufacturers, because it does not include information such as how many vehicle miles the different driver-assistance systems were used across or how widely they are deployed across carmakers’ fleets.
Still, the information gives regulators a more complete look than they had before. Previously, regulators relied on a piecemeal collection of data from media reports, manufacturer notifications and other sporadic sources to learn about incidents involving advanced driver-assistance.
“It revealed that more crashes are happening than NHTSA had previously known,” said Phil Koopman, an engineering professor at Carnegie Mellon University who focuses on autonomous vehicle safety. He noted that the reports may omit more minor crashes, including fender benders.
The data set doesn’t include every piece of information that would be helpful to know, but it could be an early indication of a focus on gathering more information and using that to improve technologies and safety regulations, said Bryant Walker Smith, a law professor at the University of South Carolina who studies emerging transport technologies.
“The promise of these, the potential of these is ultimately to make driving safer,” he said of the driver-assistance technologies. “It’s an open question whether these systems overall or individual systems have accomplished that.”
Companies such as Tesla collect more data than other automakers, which might leave them overrepresented in the data, according to experts in the systems as well as some officials who spoke on the condition of anonymity to candidly describe the findings. Tesla also pilots much of the technology, some of which comes standard on its cars, putting it in the hands of users who become familiar with it more quickly and use it in a wider variety of situations.
Several lawmakers weighed in on the report Wednesday, with some calling for greater investigation and possible safety standards for cars with the technology. Sen. Richard Blumenthal (D-Conn.) called the findings “cause for deep alarm.”
“It is a ringing alarm bell affirming many of the warnings that we’ve made over the years,” he said. “The frequency and severity of these crashes is a cause for yellow lights flashing and maybe red lights flashing on some of this technology.”
Blumenthal and Sen. Edward J. Markey (D-Mass.) have previously criticized Tesla for putting software on the roads “without fully considering its risks and implications.” On a call with media Wednesday, Markey called out Tesla’s assertion that Autopilot technology makes cars safer.
“This report provides further evidence slamming the brakes on those claims by Tesla,” he said.
The senators sent a letter to NHTSA, saying they were “deeply troubled” by the data and calling on the regulator to “use all its investigative and regulatory authorities to shed needed light on this out-of-control industry and impose guardrails to prevent more deadly crashes.”
In the letter, the senators specifically pointed to the “staggering” 273 crashes that Tesla reported.
Driver-assistance technology has grown in popularity as owners have sought to hand over more of the driving tasks to automated features, which do not make the cars autonomous but can offer relief from certain physical demands of driving. Automakers such as Subaru and Honda have added driver-assistance features that act as a more advanced cruise control, keeping set distances from other vehicles, maintaining speed and following marked lane lines on highways.
But none of them operate in as broad a set of conditions, such as residential and city streets, as Tesla’s systems do. NHTSA disclosed last week that Tesla’s Autopilot is on around 830,000 vehicles dating to 2014.
Autopilot has spurred several regulatory probes, including into crashes with parked emergency vehicles and the cars’ tendency to halt for imagined hazards.
As part of its probe into crashes with parked emergency vehicles, NHTSA has said it is looking into whether Autopilot “may exacerbate human factors or behavioral safety risks.”
Autopilot has been tied to deaths in crashes in Williston and Delray Beach, Fla., as well as in Los Angeles County and Mountain View, Calif. The driver-assistance features have drawn the attention of NHTSA, which regulates motor vehicles, and the National Transportation Safety Board, an independent body charged with investigating safety incidents.
Federal regulators last year ordered car companies including Tesla to submit crash reports within a day of learning of any incident involving driver assistance that resulted in a death or hospitalization because of injury or that involved a person being struck. Companies are also required to report crashes involving the technology that included an air bag deployment or cars that had to be towed.
The agency said it was collecting the data because of the “unique risks” of the emerging technology, to determine whether manufacturers are making sure their equipment is “free of defects that pose an unreasonable risk to motor vehicle safety.”
Carmakers and hardware-makers reported 46 injuries from the crashes, including five serious injuries. But the total number of injuries could be higher — 294 of the crashes had an “unknown” number of injuries.
One additional fatality was reported, but regulators noted it wasn’t clear whether the driver-assistance technology was being used.
Honda reported 90 crashes during the same time period involving advanced driver-assistance systems, and Subaru reported 10.
In a statement, Honda spokesman Chris Martin urged caution when comparing companies’ crash report data, noting that the firms have different ways to collect information. Honda’s reports “are based on unverified customer statements regarding the status of ADAS systems at the time of a reported crash,” he said.
Some systems appear to disable in the moments leading up to a crash, potentially allowing companies to say they were not active at the time of the incident. NHTSA is already investigating 16 incidents involving Autopilot where Tesla vehicles slammed into parked emergency vehicles. On average in those incidents, NHTSA said: “Autopilot aborted vehicle control less than one second prior to the first impact.”
Regulators also released data on crashes reported by automated-driving systems, which are commonly called self-driving cars. These cars are far less common on roads, loaded with sophisticated equipment and not commercially available. A total of 130 crashes were reported, including 62 from Waymo, a sister company to Google.
Waymo spokesman Nick Smith said in a statement that the company sees the value in collecting the information and said “any reporting requirements should be harmonized across all U.S. jurisdictions to limit confusion and potentially enable more meaningful comparisons, and NHTSA’s effort is a step toward achieving that goal.”
The automated-driving systems report shows no fatalities and one serious injury. There was also one report of an automated-driving crash involving Tesla, which has tested autonomous vehicles in limited capacities, though the circumstances of the incident were not immediately clear.
In the crashes where advanced driver-assistance played a role, and where further information on the collision was known, vehicles most frequently collided with fixed objects or other cars. Among the others, 20 hit a pole or tree, 10 struck animals, two crashed into emergency vehicles, three struck pedestrians and at least one hit a cyclist.
When the vehicles reported damage, it was most commonly to the front of the car, which was the case in 124 incidents. Damage was more often concentrated on the front left, or driver’s side, of the car, rather than the passenger’s side.
The incidents were heavily concentrated in California and Texas, the two most populous states and the places where Tesla has made its U.S. homes. Nearly a third of the crashes involving driver assistance, 125, occurred in California. Thirty-three took place in Texas.