Video released by Tempe, Ariz., police graphically shows how both an autonomous Uber SUV and its backup driver failed to protect a pedestrian who was struck and killed as she walked a bike across a spottily lit thoroughfare.

The pedestrian, 49-year-old Elaine Herzberg, was crossing the street outside the crosswalk Sunday night when she was hit, according to police.

Behind the wheel was a human chaperone who was supposed to be backstopping Uber’s developing technology. But the Uber employee, Rafaela Vasquez, 44, repeatedly took her eyes off the road in the run-up to the deadly collision, onboard video shows, raising the question of whether she was distracted.

Police have said the Volvo XC90 was traveling at about 40 mph when it struck Herzberg. It is the first known death involving the testing of driverless vehicles. The National Transportation Safety Board is investigating.

Uber would not say Thursday whether Vasquez was on a cellphone, whether she was following company procedures before the collision, or whether she remains an Uber employee. Testing of the company’s driverless fleet remains suspended, an Uber spokeswoman said, “so no vehicle operators, including this one, are on the road.”

Vasquez pleaded guilty to attempted armed robbery in 2000, according to Maricopa County Superior Court records. Uber declined to comment.

The video shows a sudden look of shock on Vasquez’s face just before the vehicle hits Herzberg, and she tenses up, apparently trying to seize control of the vehicle. Police have said there were no “significant signs of the vehicle slowing down.” Uber would not say if or when the sensor-packed SUV’s multiple laser, camera and computer systems detected Herzberg, or whether the car’s brakes were applied.

Uber declined to comment on the video, saying it did not want to prejudge the investigation.

Tempe police said the investigation is continuing and would not say whether Vasquez was distracted by something in the vehicle.

Self-driving Ubers with backup drivers have carried passengers on 50,000 rides in Arizona and in Pittsburgh, the company said. Customers call for a car with the app as usual, and are notified if they happen to be selected for an autonomous ride. They are charged the normal Uber X fare, and can decline the self-driving experience if they want a human to do the driving, the company said.

Uber would not comment on the precise procedures and instructions for backup drivers but said “the standard protocol is to be hovering and be ready to intervene as needed.” Backup drivers have three weeks of training in classrooms, on closed courses and on public roads with a backseat driver coaching them, the company said.

But that training can butt up against human nature and personal responsibility.

Many developers of driverless technology say humans can easily be lulled into a false sense of security as they putter along city streets, leaving them mentally unprepared to suddenly seize control of the wheel if they are needed.

“If humans become over-reliant on the technology, and yet they still have a role to play in safe operations of the vehicle, that is absolutely a risk factor,” said Deborah A.P. Hersman, president of the National Safety Council and former chairman of the National Transportation Safety Board.

Some tech firms and carmakers are pushing for cars with no steering wheels to avoid this middle ground where people are expected to backstop technology. California has required the presence of such safety drivers, though starting next month firms can apply for a permit not to use them if certain requirements are met.

Arizona, which has touted its limited regulations as a competitive advantage in the burgeoning industry, does not require the drivers. Uber had used them in Arizona anyway because, some outside observers said, it wanted an extra layer of safety as its technology matured.

The vehicle that hit Herzberg “was being supervised. There’s a reason there’s a safety driver there, because Uber wasn’t confident of the performance of that vehicle under those conditions at that time,” said Bryant Walker Smith, an assistant professor of law at the University of South Carolina and an authority on autonomous cars.

In a marketing video issued in October 2017, Uber boasted of its rigorous operator training and the safety capabilities of its vehicles.

“Our vehicle operators are extensively trained to handle everything from a thunderstorm to a gaggle of geese crossing the road, so you can ride comfortably knowing that our team is committed to keeping you safe,” said the narrator of the video, posted to Uber’s website.

It noted the value of their human chaperones.

“Then there’s Ryan,” the video continued. “He’s what we call a vehicle operator and he’s here to make sure the vehicle does exactly what it’s supposed to do. But before Ryan could hit the road, he had to hit the books. He’s one of hundreds of vehicle operators who’ve passed test after test in the classroom and out on the track. These tests teach operators and vehicles to expect the unexpected – like swinging car doors, pedestrians and unusual roadways.”

Hersman said the crash raised questions about what fail-safes are in place to address potential equipment defects in self-driving cars.

She drew a parallel to railroad grade crossings, where the default response is for the crossing arms to descend if certain equipment is not working.

“There needs to be notice that a system is not functioning or not operational,” she said. “And there needs to be a safe mode.”

Precisely when such a mode would kick in should be the subject of additional research covering design flaws and limitations of driverless technology, she said.

Researchers Magda Jean-Louis and Eddy Palanzo contributed to this report.