A passenger rides in a pilot model of a self-driving Uber car in Pittsburgh. (Angelo Merendino/AFP/Getty Images)

Each day, driverless cars carry passengers around U.S. cities big and small. But federal officials — driven by bipartisan concerns about stifling a promising industry or seeming too old-fashioned — have not imposed any new safety requirements.

On Tuesday, the Trump administration weighed in with its first set of suggestions for how autonomous vehicles should be managed. It continues — and, in several significant ways, extends — the generally hands-off approach taken under the Obama administration, which released the first set of voluntary guidelines last year.

Transportation Secretary Elaine Chao announced the “2.0” version of the federal policy in Ann Arbor, Mich. The guidelines continue to rely on technology companies and automakers to voluntarily submit information explaining why their cars are safe and how their passengers will be protected.

Under President Barack Obama, the policy was built around a 15-point safety checklist, covering areas such as crashworthiness, how cars are meant to respond to hazards and where they are designed to drive. Under President Trump, several key areas were dropped from the list, including privacy and ethical considerations.

Those were removed because they were “speculative in nature and outside” the authorities of the National Highway Traffic Safety Administration, according to an explanation of the change that accompanied the revised guidance. “These are important areas for further discussion and research, but it would be premature to include those considerations in this document,” according to NHTSA.

The new guidance also repeatedly emphasizes that it is voluntary and says the Transportation Department “strongly encourages” states not to make elements of it mandatory or step into vehicle safety matters controlled by the federal government.

Earlier this year, California officials proposed requiring that companies provide them with copies of the voluntary letters the firms submit to NHTSA.

A NHTSA official said it is “a much cleaner and streamlined approach” to make companies responsible for releasing their own letters, rather than having them come through federal safety officials, which left the mistaken impression that their content had to be approved in Washington.

“It’s all on them to make it public,” said the official, who spoke to reporters under ground rules requiring that he not be named. “Now it’s more of a question to the companies: Did you make your assessment public? And, if not, why not?”

The federal government’s largely laissez-faire approach has so far come off without major problems. While a deadly Tesla crash raised questions about the safe use of partially automated vehicles and the danger of drivers being lulled into complacency, there have been no known U.S. fatalities in cars designed to do all the driving. But some consumer safety advocates have warned that safety oversight is lacking.

Missy Cummings, who heads Duke University’s Humans and Autonomy Lab, said companies should be required to meet basic safety standards.

“We were already talking about voluntary requirements. Now we’re trying to relax the voluntary requirements to be even more relaxed,” Cummings said. At a minimum, she said, companies putting driverless cars in use should have to guarantee their cars can detect people on the side of the road, including state troopers, construction crews and motorists changing tires.

Advocates of the voluntary approach pointed instead to the technology’s potential to improve safety.

“Since the Department of Transportation was established in 1966, there have been more than 2.2 million motor-vehicle-related fatalities in the United States,” Chao said in a statement. Automated driving systems “have the potential to significantly reduce highway fatalities by addressing the root cause of these tragic crashes,” she said.

General Motors commended the policy update, saying it “provides clear, streamlined, and flexible guidance for the safe and responsible design, manufacture, and deployment of self-driving vehicles.”

From San Francisco to the Phoenix area to Pittsburgh, tech companies and carmakers have tested the mettle of their young robo-cars on public roads, working out performance kinks along with human drivers. But unlike human drivers, who have to pass a driving test, driverless vehicles are not required to meet specific safety standards concerning automation.

Tech companies and many state officials argue that current federal law allows companies to replace drivers with algorithms and sensors as long as the basic machinery of the cars follows existing vehicle safety standards. That means autonomous cars can tool along highways as long as they have a steering wheel, even if it’s there largely for decoration.

General industry practice among companies developing driverless technology is to have safety drivers sitting behind the wheel as chaperones, ready to seize control if the systems fail. That is also useful for research and, presumably, lowers liability in case of a problem. But some states have argued there is no legal requirement for human drivers to be there.

Congress has begun wading in.

The House last week passed a bipartisan bill addressing key concerns of automakers and tech companies, while also taking on safety questions.

The House bill instructs Chao to require “safety assessment certifications” that demonstrate driverless cars “are likely to . . . function as intended and contain fail safe features.” That would have to be done within two years.

The bill also prevents states from regulating “the design, construction, or performance” of automated vehicles, saying that power rests in Washington. Driverless developers have been largely unified in opposing what they call a “patchwork” of regulations that they say would stymie the industry and undercut interstate travel.

The Senate released a staff draft last week showing that debate continues on key areas, including how to treat commercial trucking, how to handle liability, and how far the federal government should go in preempting states from imposing their own requirements.