SAN FRANCISCO — Federal investigators are stepping up their probe of Tesla Autopilot, the driver-assistance system that comes standard on the company’s electric vehicles, as they evaluate the feature’s role in repeated crashes into parked emergency vehicles.
The agency began evaluating the issue in August 2021, following nearly a dozen crashes under similar circumstances involving stationary emergency vehicles such as ambulances and police cruisers, some in low-light conditions. The agency has identified 15 injuries and one death linked to the crashes, according to documents it posted.
Tesla Autopilot automates some driving features, such as keeping vehicles in their lanes, making lane changes and maintaining a safe distance from other vehicles. Though the system allows owners to hand over some driving tasks to their vehicles, drivers are required to pay attention at all times, and their cars seek to monitor whether they are engaged.
Tesla did not immediately respond to a request for comment.
Tesla has previously touted the safety of Autopilot, describing it as safer than normal driving when crash data is compared. Tesla CEO Elon Musk has called it “unequivocally safer.” Musk has also taken aim at federal auto safety regulators over the company’s repeated recalls, some of which were encouraged by NHTSA.
NHTSA, in its filing opening the upgraded investigation, said it stepped up its probe after identifying patterns across the various crashes that it deemed worthy of further examination. The probe encompasses approximately 830,000 vehicles dating back to model year 2014, spanning Tesla’s current range of models.
“The investigation opening was motivated by an accumulation of crashes in which Tesla vehicles, operating with Autopilot engaged, struck stationary in-road or roadside first responder vehicles tending to preexisting collision scenes,” the agency said in its report this week.
Each of the crashes occurred on “controlled-access” highways, it said. Drivers would have been able to spot the emergency vehicles, such as ambulances or police cruisers, an average of eight seconds before the crash, NHTSA said. But in those crashes, “no drivers took evasive action between 2-5 seconds prior to impact.”
NHTSA said it is examining whether Autopilot “may exacerbate human factors or behavioral safety risks.”
Tesla last year issued a remote software update to its fleet of internet-connected cars, aiming to better equip them to detect emergency vehicles in low light. The update drew NHTSA’s attention because Tesla made the change without filing the requisite paperwork to inform regulators.
Tesla has the ability to remotely issue software updates that change how its vehicles behave, sometimes patching safety issues in the process. But regulators’ insistence that it must publicly spell out those changes has ushered in a spate of recent recalls, which led Musk to decry NHTSA as the “fun police” this year.
NHTSA’s engineering analysis will examine 16 crashes, up from the 11 the agency had initially identified when it opened its probe last August, after officials found additional incidents fitting the description during the course of their evaluation.