John Krafcik, chief executive of Waymo, the autonomous vehicle company created by Google’s parent company, introduces a Chrysler Pacifica hybrid outfitted with Waymo’s suite of sensors, including radar. (Paul Sancya/AP)

To help keep tabs on the safety of driverless cars rolling around U.S. cities, the federal government last year, and again last month, suggested that tech firms and car companies submit safety checklists.

None of the companies rushed to meet Washington’s wishes.

Now, Waymo, formerly Google's self-driving car project, has submitted a 43-page safety report to the Transportation Department, offering the most detailed description yet of how it equips and programs vehicles to avoid the range of mundane and outrageous problems that are part of driving in America.

“We’ve staged people jumping out of canvas bags or porta-potties on the side of the road, skateboarders lying on their boards, and thrown stacks of paper in front of our sensors,” according to the report, which was submitted Thursday and describes how company engineers use a 91-acre California test facility mocked up like a city, as well as computer simulations covering hundreds of thousands of variations of possible road scenarios.

The National Highway Traffic Safety Administration (NHTSA) has suggested a set of 28 “behavioral competencies,” or basic things an autonomous vehicle should be able to do. Some are exceedingly basic (“Detect and Respond to Stopped Vehicles,” “Navigate Intersections and Perform Turns”) and others are more intricate (“Respond to Citizens Directing Traffic After a Crash”).

Waymo lists an extra 19 examples of challenges it uses for testing, including that its cars must be able to “detect and respond” to animals, motorcyclists, school buses, slippery roads, unanticipated weather, and faded or missing road signs.

The company says it has used federal data on human crashes to focus its efforts on improving its software-and-sensor drivers. Top problem scenarios for flesh-and-blood drivers include rear-end crashes, turning or crossing at intersections, running off the edge of the road, and changing lanes. So those “figure prominently in the evaluation of our vehicles,” according to the report.

And then numerous permutations are generated from those scenarios. “We can multiply this one tricky left turn to explore thousands of variable scenarios and ‘what ifs?,’ ” the report says. “The scene can be made busier and more complex by adding . . . joggers zigzagging across the street.”
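The kind of scenario multiplication the report describes can be pictured as a parameter sweep over one base situation. The sketch below is purely illustrative (the parameter names, values, and structure are assumptions, not anything from Waymo's report): it fans a single tricky left turn out into dozens of variants by varying traffic speed, pedestrian count and weather.

```python
import itertools

# Hypothetical sketch: multiply one base scenario into many variants by
# sweeping a few parameters, loosely in the spirit of the report's
# "thousands of variable scenarios and 'what ifs?'". All names and values
# here are illustrative assumptions.
BASE_SCENARIO = {"maneuver": "unprotected_left_turn", "location": "urban_intersection"}

SPEEDS_MPH = [15, 25, 35, 45]      # oncoming traffic speed
JOGGER_COUNTS = [0, 1, 3, 6]       # joggers zigzagging across the street
WEATHER = ["clear", "rain", "fog"]

def generate_variants(base):
    """Yield one scenario dict per combination of swept parameters."""
    for speed, joggers, weather in itertools.product(SPEEDS_MPH, JOGGER_COUNTS, WEATHER):
        scenario = dict(base)
        scenario.update(oncoming_speed_mph=speed,
                        jogger_count=joggers,
                        weather=weather)
        yield scenario

variants = list(generate_variants(BASE_SCENARIO))
print(len(variants))  # 4 * 4 * 3 = 48 variants from a single base scenario
```

Real simulation pipelines would sweep far more dimensions, which is how a handful of recorded drives can balloon into the "hundreds of thousands of variations" the report mentions.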

NHTSA said in a statement Thursday that Waymo is “the first company to make a voluntary safety self-assessment public.” While such reports are now voluntary, the House and Senate each passed bills that would require companies to submit safety assessments in the coming years.

Some road safety advocates argue that driverless cars should be required to pass specific safety tests before being put on the roads, just like human drivers. And they say the federal government has taken a dangerously laissez-faire approach to the burgeoning industry.

But with tens of thousands of people killed each year on U.S. roads, driverless-vehicle firms promise big improvements overall. Waymo executives say their safety report is part of an effort to be more transparent about their experiences, which they hope will be good for public understanding — and business.

“This overview of our safety program reflects the important lessons learned through the 3.5 million miles Waymo’s vehicles have self-driven on public roads, and billions of miles of simulated driving, over the last eight years,” Waymo chief executive John Krafcik wrote in a letter Thursday to Transportation Secretary Elaine Chao.

The report offered a view into how Waymo’s software breaks down the 360 degrees of data constantly pouring in from radar, laser sensors, high-definition cameras, GPS and an audio detection system the company says can hear sirens hundreds of feet away.

First is perception, in which the vehicle classifies objects and stitches them into a “cohesive real-time view of the world,” the company said. That means distinguishing cars from people, and bicycles from motorcycles.

Next is modeling and predicting the behavior of the objects the vehicle encounters. So, for example, the software knows that walkers move more slowly than bikers of either variety, but also that pedestrians can change direction abruptly.

Then the pieces come together in what the company calls its “planner,” which figures out where the car actually will go and is imbued with a “defensive driving” sensibility. It keeps the car out of the blind spots of nearby human drivers, gives cyclists extra room and games out what is coming several steps ahead of time.

But cars, like humans, cannot think of everything. How well they manage that reality — and deal with the unexpected — will help determine how good they really are.

“You can’t expect to program the car for everything you’re possibly going to see,” said Ron Medford, Waymo’s safety director and a former senior NHTSA official. Extensive driving experiments feed simulations that essentially provide the car with experience, which helps greatly, and what it learns is passed on to the entire fleet.

And if it really doesn’t know what to do, it can pull over safely, he said.