Another day, another crash involving a Tesla.

On Wednesday, Tesla said one of its customers in Beijing was caught unprepared last week when his car, which had autopilot enabled, sideswiped another vehicle that was partially parked off the side of the road. The accident caused some damage, according to Tesla, but nobody was hurt.

Tesla's analysis of the vehicle's data logs showed that the driver didn't have his hands on the wheel. The driver doesn't appear to dispute that account, but he accuses Tesla salespeople of misleading him into thinking the car could fully drive itself.

Here's how Tesla described the crash in a statement to The Washington Post:

The customer’s dash cam video shows that the Tesla was being driven on a highway in China where a vehicle was parked on the left shoulder straddling the left lane. The Tesla was following closely behind the car in front of it when the lead car moved to the right to avoid hitting the parked car. The driver of the Tesla, whose hands were not detected on the steering wheel, did not steer to avoid the parked car and instead scraped against its side. As clearly communicated to the driver in the vehicle, Autosteer is an assist feature that requires the driver to keep his hands on the steering wheel at all times, to always maintain control and responsibility for the vehicle, and to be prepared to take over at any time.

Given the number of Tesla accidents that have made the news in the last three to four months, it's vital that drivers fully understand what Tesla's autopilot can and can't do. It's not like one of Google's driverless cars, where engineers hope passengers will someday be able to push a button, sit back and enjoy the ride. Tesla's autopilot is much closer to a form of advanced cruise control.

The autopilot feature was rolled out in October 2015. But unlike a fully driverless car, which the government would classify as "Level 4" automation, Tesla's autopilot is designed only to keep you within your lane and prevent you from hitting other cars, and only under a limited set of circumstances. Even extreme heat or a misaligned bumper can be enough to throw off the system.

Of the two companies, Google and Tesla, only Tesla is testing its technology on real-world customers, a decision that helped lead to the fatal Tesla crash in Florida in May.

Tesla is generally up front in its marketing and user interface these days, reminding people that autopilot is a driver-assist feature, not a driver automation feature. For instance, to enable the automatic steering feature on a Tesla, drivers first have to click through a warning that appears on their car's screens. The language looks like this:

Autosteer feature is currently in Beta: Autosteer is an assist feature that requires you to keep your hands on the steering wheel at all times. It is designed for use on highways that have a center divider and clear lane markings. It is also appropriate on other kinds of roads while in slow-moving traffic. It should not be used on highways that have very sharp turns or lane markings that are absent, faded, or ambiguous. Similar to the autopilot function in airplanes, you need to maintain control and responsibility for your vehicle while enjoying the convenience of Autosteer.
Please refer to the Owner’s Manual for more detailed information about this feature.
Do you want to enable Autosteer while it is in Beta?

In its owner's manual, Tesla also lists a number of conditions that can thwart the autopilot system.

At this point, you could say that anyone who fails to operate autopilot according to the manufacturer's instructions is willfully throwing caution to the wind. But some analysts say that Tesla's marketing and technology may bear legitimate responsibility for the confusion among certain drivers.

For example, Tesla told The Post that its autopilot does not immediately give drivers an audio and visual warning when they take their hands off the wheel; it does so only when the system becomes less confident in its ability to function safely. According to Karl Brauer, an auto analyst at Kelley Blue Book, that sets Tesla apart from the rest of the industry, whose advanced cruise control systems tend to sound an alarm moments after a driver's hands leave the wheel. Tesla's approach, he said, may end up encouraging some drivers to let their attention drift more than owners of other cars do.

"None of the other car companies besides Tesla seem to be having the same problem," Brauer said. "Maybe it's a big media overreaction. Maybe it is happening with everyone else. But we're not hearing about it. We're not hearing about Honda CR-Vs crashing, or Mercedes-Benzes crashing."

Like Tesla and Google, many automakers are investing heavily in automation technology, whether partial or full. A recent update to Ford's adaptive cruise control, for example, lets the car automatically handle stop-and-go traffic.

Tesla's chief executive, Elon Musk, has said that fully self-driving cars are coming much sooner than many people think. We may even get fully automated Teslas within a couple of years, all of which highlights how Tesla's current autopilot technology is not the same as full automation.

Still, to many of us who don't work in the automotive or technology industries, it can be hard to tell the difference. That's why some think even calling Tesla's autopilot, well, "autopilot" is a big mistake.

"By marketing their feature as ‘Autopilot,’ Tesla gives consumers a false sense of security," Laura MacCleery, a vice president at Consumer Reports, said last month. "We're deeply concerned that consumers are being sold a pile of promises about unproven technology."

Even some who write about cars for a living have argued that Tesla's autopilot branding risks misleading consumers.

"With a name like that, is it any wonder that Tesla drivers might assume that the semiautonomous driving system can just take over?" wrote CNNMoney's autos writer, Peter Valdes-Dapena, last month. Although Tesla puts its warnings in the owners' manual, he added: "How many drivers ever read the owner's manual? Based on most of the people I've ever talked to in the auto industry, very few."

Even if a driver has read the manual, Tesla's marketing surrounding its autopilot feature makes it easy to blame the technology when a crash occurs, even if the crash data appear to demonstrate otherwise. That's the tack the driver in Beijing appears to be taking. Last month, a Tesla driver in Pennsylvania blamed the autopilot for causing a crash even though the feature appeared to be inactive at the time. And in another incident this summer, Tesla's logs undermined claims by a Model X owner that his car accelerated by itself into the side of a shopping mall.

In all of these cases, as will become increasingly common in the future, investigators have looked closely at data recordings that show exactly what a car was doing in the moments before a crash. The data can help analysts pinpoint whether a driver had their hands on the steering wheel, how far the accelerator was being pressed, whether the brakes were applied and other information.

Although authorities on the scene may perform an initial assessment of a crash and write a ticket or make an arrest right away, the data from the recorders can improve the odds of correctly assigning blame, analysts say.

Humans tend to trust a new technology pretty quickly once they actually get a taste of it and understand what it can do. In the automotive space, we don't have to reach far back in history to find a comparable example.

"When anti-lock brakes first came on the scene as a technology, insurance companies were frustrated because their actuarial tables showed that people with anti-lock brake cars were getting in more accidents," said Brauer.

The results were deadly: A 1996 study backed by the insurance industry found that people were 45 percent more likely to die in a car with ABS than in a car without the new technology. While these findings understandably led to some initial alarm, the reason for the increased crashes ultimately boiled down to people becoming too comfortable too quickly with a technology they didn't understand.

"Not only were they driving less carefully with anti-lock brakes on, but they weren't [educated] to properly utilize them when they needed them," said Brauer.

That problem has since been overcome. But it took a great deal of driver education — informing drivers not to expect too much of their anti-lock brakes — and some revisions to the technology, such as eliminating the rapid vibration in the brake pedal that could surprise drivers unfamiliar with ABS.

So it comes as little surprise to see some drivers willingly give their lives over to a feature that calls itself "autopilot." The very word conjures images of airliners that fly gracefully from point A to point B, all by themselves. But the reality is more complicated: aircraft autopilot is engaged only under the appropriate conditions, and a real-world human pilot must still be prepared to take over if something goes wrong. That's a detail we can be all too willing to overlook.