On June 30th, 2016, the National Highway Traffic Safety Administration (NHTSA) opened an investigation into Tesla Motors following a fatal crash involving a Tesla Model S. At the time of the crash, the car's Autopilot software was enabled and driving the car. The autonomous system missed a truck crossing the highway perpendicular to the Tesla's direction of travel. As a result, the Tesla passed under the truck's trailer. Presumably, the parts of the trailer that tore through the Tesla's windshield caused the driver's death.
Tesla's blog post on the incident states, in part:
These averages are so broad that the comparisons are meaningless.
For example, Tesla quotes (without citation) that among all vehicles in the US, there is a fatality every 94 million miles. The "all vehicles" category presumably includes buses and farm equipment. Does comparing the fatality rates of mopeds and Tesla cars make sense? Furthermore, Tesla's Model S has a suggested retail price of $70,000 USD, which is hardly representative of average vehicles. What is the fatality rate for vehicles comparable to Tesla's Model S? Are the comparable vehicles similar enough to reach meaningful conclusions? What do different vehicle classes look like? How are the associated driver populations for each category best described? Are populations associated with higher fatality rates likely to adopt autonomous driving systems in the first place?
Recall that the fatality-per-mile-traveled averages include the effects of negligent drivers. In this light, Tesla's own numbers are questionable at face value. Specifically, according to the CDC, 31% of US driving-related fatalities involve alcohol impairment. Thus, sober human drivers cause one fatality per roughly 136 million miles traveled, instead of the 94 million miles quoted. The CDC data also indicates that a further 16% of crashes involve drugs other than alcohol, but the number of resulting fatalities for those collisions could not be determined from the available data.
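The sober-driver adjustment above is simple arithmetic; a quick sketch makes it explicit (the inputs are the two figures already cited: Tesla's one-fatality-per-94-million-miles and the CDC's 31% alcohol share):

```python
# Back-of-the-envelope check of the sober-driver adjustment.
# Inputs: one fatality per 94 million miles overall (Tesla's figure),
# and 31% of driving fatalities involving alcohol impairment (CDC).
miles_per_fatality_all = 94e6
alcohol_share = 0.31

# Removing alcohol-involved fatalities leaves 69% of fatalities over the
# same miles, so miles-per-fatality scales by 1 / (1 - 0.31).
miles_per_fatality_sober = miles_per_fatality_all / (1 - alcohol_share)
print(round(miles_per_fatality_sober / 1e6))  # 136 (million miles)
```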
Tesla's post compares fatality-per-mile averages. However, Tesla's total of 130 million miles traveled pales in comparison to the miles driven to arrive at the CDC averages; the sample sizes differ by several orders of magnitude. Is Tesla's sample really large enough to support accurate conclusions? Tesla seems to think so, so an assessment is fair game. The numbers above show the Autopilot software compares favorably to possibly driving under the influence of intoxicants. Moreover, Autopilot is roughly equivalent to an average driver: likely speeding, perhaps reckless, in some cases impaired by drugs other than alcohol. Altogether, Autopilot is not a good driver.
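To put the "several orders of magnitude" claim in rough numbers: US vehicles travel on the order of 3.1 trillion miles per year (an approximate figure; the national averages are computed over traffic volumes of this scale), against Tesla's quoted 130 million Autopilot miles:

```python
import math

# Rough scale comparison. The 3.1 trillion annual US vehicle-miles figure
# is approximate, used only to show the gap in sample sizes.
us_annual_miles = 3.1e12
tesla_autopilot_miles = 130e6  # Tesla's quoted Autopilot total

ratio = us_annual_miles / tesla_autopilot_miles
print(round(ratio))                    # ~24,000x more miles in one US year
print(math.floor(math.log10(ratio)))  # 4 -> four orders of magnitude
```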
As a side note, when Tesla invokes fatality-per-miles-traveled averages, the implication is that Tesla's Autopilot is better than average and hence good. But most drivers incorrectly believe themselves better than average. It follows that the average driver underestimates what it takes to be a good driver. Tesla's statement could be setting up average readers to deceive themselves by tacitly appealing to their illusory superiority.
But back to the story. What are the self-driving car performance expectations? This June 30th CNN article states, in part:
The year 2020 is basically just around the corner. Today, drivers feel compelled to forcibly take control from autonomous driving systems alarmingly often. This LA Times article from January 2016 states, in part:
Both the information and the questions required for good understanding are missing. Would you be comfortable being driven by someone who misses branches, or just fails to drive at all, as frequently as once a mile? Are the driving conditions for those driven miles reported by Google and others realistic? What proportion were driven in snow, ice, heavy rain, fog, or smoke? Did autonomous driving systems encounter geese, ducks, or deer on the road? How do those systems handle emergency situations?
Suppose you will never let beta software drive you around. What happens when you are affected by someone who does? Back to the CNN article,
Given this expert assessment, what does the lack of Tesla disengagements in California DMV's report mean? That Tesla's software is just better? That the average Tesla driver is less engaged? Does the Tesla crash suggest so-so software is duping drivers into not paying attention?
But even if a timely warning were possible, are driving autopilots a good idea in the first place? In aviation, autopilots do most of the flying, and as a result human pilots' ability to fly by hand deteriorates. Recovering from in-flight emergencies often requires manual flying, and an emergency is not the time to discover those skills are lacking. Specifically, the professional recommendation is:
In contrast, driving autopilots are promoted for heavy use, and especially for low-workload driving scenarios. The ideal situation often casts the driver as a self-absorbed mass transit passenger:
Note the irony of "progress" illustrated by reading a paper (!) book, comfortably sitting with all driving controls out of reach. And there is more than one irony in play: studying from paper rather than a tablet is associated with better comprehension and retention of the material. Paper also leads to better results than a Kindle. Why does this picture associate technological improvement with paper books?
Of course, the flying environment is very different from the driving environment. Kids and pets don't run in front of a plane from behind a row of clouds. Mid-air collisions are rare because air traffic control enforces generous separation with multiple radars. And if you listen to recordings of aviation mishaps, you will notice bad situations develop over comparatively long periods of time. Limited flight autopilot failures can be tolerated because the entire flying environment is engineered to catch problems before they become insurmountable.
In contrast, there is no car equivalent to the cockpit's emergency procedure binder. The driving experience still requires quick, judicious action. Experience with flight autopilots shows that excessive dependency can compromise pilot skills. So why should the professional advice for planes, with all the implied liability and gravitas, be any different in nature for cars? And if inattentive drivers do not have the time to recover from an undesired state, why should drivers stop paying attention in the first place? Tesla's own advice agrees: "be prepared to take over at any time".
For clarity's sake, maybe Tesla's "Autopilot" is better described in terms of "Driver Assist".
For time's sake, maybe traffic and community planning is a better way to curb hours wasted while driving. As an example, shutting down container shipping terminals increases truck traffic. Trucks disproportionately wear down roads because pavement damage grows with roughly the fourth power of axle weight --- each truck axle carrying 18,000 pounds does the damage of about 10,000 cars. So, more truck traffic means more road work, which in turn causes even more congestion. Self-driving vehicles can be irrelevant to traffic.
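The 10,000-cars figure follows from the fourth-power relationship if a passenger-car axle carries roughly 1,800 pounds (an assumed value for illustration; actual axle loads vary by vehicle):

```python
# Fourth-power law sketch: pavement damage scales with roughly the
# fourth power of axle weight. The 1,800 lb car-axle load is an
# assumption chosen to illustrate where "10,000 cars" comes from.
truck_axle_lb = 18_000
car_axle_lb = 1_800

damage_ratio = (truck_axle_lb / car_axle_lb) ** 4
print(damage_ratio)  # 10000.0 -> one truck axle ~ ten thousand cars
```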
For everyone's sake, maybe characterizing the behaviors correlated with fatalities could help keep drivers exhibiting those behaviors off the road. According to the CDC, ignition interlocks prevent drunk driving about 70% of the time, but of course this is invasive for the sober majority. Instead, a reasonable non-invasive Driver Assist feature could detect unfit driving. Critically, this approach does not require developing a fully fledged autonomous driving system to be, in some respects, perhaps just as helpful.
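As a purely hypothetical sketch of such a feature, unfit driving might be flagged from erratic lane keeping alone, without ever taking control of the car. The signal, threshold, and window below are all invented for illustration:

```python
from statistics import pstdev

# Hypothetical unfit-driving detector: flag a driver whose lane-position
# samples vary more than a threshold. The threshold and the sample data
# are assumptions, not values from any real system.
LANE_OFFSET_STDEV_LIMIT_M = 0.4  # assumed threshold, in meters

def looks_unfit(lane_offsets_m):
    """Return True if lane-position samples vary beyond the threshold."""
    return pstdev(lane_offsets_m) > LANE_OFFSET_STDEV_LIMIT_M

steady  = [0.05, -0.03, 0.02, 0.00, -0.04]  # small, stable offsets
erratic = [0.9, -0.8, 0.7, -1.0, 0.6]       # weaving across the lane

print(looks_unfit(steady))   # False
print(looks_unfit(erratic))  # True
```

A real feature would of course fuse many signals (speed, braking, reaction times), but the point stands: detecting unfitness is a far smaller problem than driving the car.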
Update: it looks like common sense is finally catching on --- the NTSB says fully autonomous cars probably won't happen, many proponents of the technology are running into trouble and/or scaling back expectations, and Tesla just disabled its Autopilot system.