The Washington Post
Democracy Dies in Darkness

‘Elon Musk’s Crash Course’: 3 key arguments from the Tesla documentary

“Elon Musk's Crash Course” dives into the billionaire's business with Tesla. (FX)

By now, everybody knows the two-pronged promise Tesla has been making for nearly a decade: The car company aims to revolutionize both cars’ relationship to the environment (through gas-free electric power) and consumers’ safety on the roads (through self-driving capabilities). Tesla CEO Elon Musk has long been fond of emphasizing that traffic deaths would decrease if driving weren’t in the all-too-human hands of, well, the driver — and has promised that one day, traveling by car will be like taking an elevator. “You’ll tell it where you want to go, and it takes you there with extreme levels of safety.”

The cars are certainly electric. That second objective, a new documentary argues, has proved more elusive.

Informed by the reporting of the New York Times’ Cade Metz and Neal Boudette, director Emma Schwartz’s “Elon Musk’s Crash Course” raises a skeptical eyebrow toward Tesla’s vaunted Autopilot feature, sometimes described as its self-driving software. It maintains that Autopilot hasn’t lived up to its promise and that lives have been endangered as a result. Here are three key arguments Schwartz’s film puts forth.

1. Despite Tesla’s claims that its technology would revolutionize cars for the safer, its cars have sometimes failed to recognize certain safety threats while in Autopilot mode — and Tesla drivers have had fatal road accidents while using it.

According to “Elon Musk’s Crash Course,” an investigation in 2016 by the National Highway Traffic Safety Administration (NHTSA) found that some 38 Tesla crashes had taken place in the United States while the cars were in Autopilot mode, but the film details three in which drivers were killed.

The first is that of Josh Brown, a bomb dismantler for the U.S. Navy in the Iraq War and the founder of a company that aimed to extend Internet service into rural America. Described by his friends as a passionate tech enthusiast, Brown loved his Tesla and often filmed videos behind the wheel. When Musk retweeted one such video in April 2016, in which the car in Autopilot mode steered itself out of the way of a truck merging too aggressively, Brown was elated.

Brown was driving on the same mode through Williston, Fla., after leaving Disney World the following month when his Tesla drove under a tractor-trailer without slowing down. Brown, 40, was killed in the collision. (Despite rumors that Brown had been watching a movie, the documentary makes clear that no movies were found on Brown’s laptop. Still, NHTSA and the National Transportation Safety Board, or NTSB, found that Brown was at fault because he was not paying attention to the road.) In the film, Musk is heard in an audio recording saying later that radar upgrades that were added to the Autopilot software after Brown’s accident might have saved Brown’s life.


In March 2018, 38-year-old Apple engineer Walter Huang died when his Tesla, running in Autopilot mode, hit a concrete barrier in Mountain View, Calif., at over 70 mph. Former NTSB chairman Robert L. Sumwalt says on-screen that Huang was found to have been playing a video game.

And in March 2019, Jeremy Banner, 50, was killed in another Florida highway accident, nearly identical to the one that killed Brown. The Tesla was on Autopilot when a tractor-trailer pulled across the road. Banner’s car failed to recognize the side of the vehicle in the bright sunlight and went underneath it, shearing off the roof.

Sumwalt alleges in “Crash Course” that Tesla has ignored its safety recommendations after crashes. “When innovation is implemented, we have to make sure it’s done safely,” he says, “or it’s going to be the Wild West out there.”

2. Some former Tesla engineers privately harbored doubts about Musk’s public promises regarding the cars’ ability to drive themselves.

Despite Musk’s claims starting in 2015 that self-driving cars were essentially a “solved problem” and that the kinks were merely being worked out, multiple former staffers allege in “Crash Course” that wasn’t the case behind closed doors.

They say, for example, that certain decisions were made somewhat arbitrarily — such as the decision to use cameras instead of lidar, a popular laser-based sensing system. “There was no deep research phase where various vehicles were outfitted with a range of sensors. Many team members would have liked that,” says Akshat Patel, Autopilot’s engineering program manager from 2014 to 2015. “Instead, the conclusion was made first, and the test and development activities began, to prove that conclusion correct.”

Others allege that they worried the Autopilot technology was being sold to and used by people who believed it would provide the same elevator-like transportation experience Musk had once described — drivers who believed they could get in, provide a destination, then sit back and relax. When Brown’s crash happened, “I was aware that people were trusting the system to do things that it was not designed or capable of doing,” says JT Stukes, senior project engineer at Tesla from 2014 to 2018. “The fact that that sort of accident happened is obviously tragic. But it was going to happen.”

Raven Jiang, an engineer who also worked on Autopilot at Tesla from 2015 to 2016, notes that in the same time frame, Elizabeth Holmes’s transgressions at Theranos were being revealed to the public. “Some of those stories were at the back of my mind,” Jiang says. “It definitely made me question a lot more about what’s behind some of this public optimism.”

3. Tesla enjoys substantial public support anyway.

The most recent footage included in “Crash Course” comes from just last month. Musk, wearing a black cowboy hat and black aviators, grins onstage in front of a whooping, enraptured crowd at the launch party for Tesla’s new Gigafactory in Austin. Partygoers hold their phones up to film him speaking — a stark reminder that Musk is a megacelebrity and a hero to many.


One Tesla owner, Alex Poulos, points out that Musk superfans sometimes call themselves “Musketeers.” Kim Paquette, another Tesla owner who’s part of an elite group that test-drives new versions of the self-driving software, shows off her collection of Hot Wheels-size Teslas and says she’s “honored” to participate in the testing process. “People who buy a Tesla understand that it’s not self-driving yet,” she says. Even Brown’s family says that “part of Joshua’s legacy is that the accident [that caused his death] drove additional improvements, making the new technology even safer,” in a statement read on their behalf at a building dedication for him. “Our family takes solace and pride in the fact that our son is making such a positive impact on future highway safety.”

And yet, Poulos says, “Full self-driving, that’s what I paid for and I don’t have it. It’s right there in the name of it, right? And I don’t think that’s fair to say.

“Musk, I think he has a huge responsibility,” he adds. “I think he needs to be a little bit more cautious about what he tells his followers.”
