There’s been a lot of news (and misinformation) over the past few days about Tesla Autopilot (AP), primarily about NHTSA opening a Preliminary Evaluation of AP performance in relation to a May 7 crash in Florida in which a man lost his life. More recently there was a crash in Pennsylvania, with the driver stating that AP was engaged, as well as one this past weekend in Montana that was blamed on AP.
Autopilot – Driver Assist
Tesla Autopilot is the marketing name given to a suite of current and planned capabilities. Today this includes two driving features:
- Traffic Aware Cruise Control (TACC) is basic cruise control along with the ability to follow a vehicle in front at some specified distance. This is not substantially different from similar systems in other cars, except for operating a bit more smoothly.
- Autosteer is a feature that will, when lane lines are clearly present, steer the car to stay within the lines, and will change lanes when the driver instructs it to do so using the blinker. This is the feature that differs from other manufacturers’ capabilities and where Tesla is quite advanced compared to others. A recent Motortrend article included this chart:
AP is, as of today, a driver assist feature. With current capabilities it is not intended to autonomously drive the car in most situations and this is made clear to Tesla drivers.
- AP capability is turned off by default. When it is turned on the driver must acknowledge a statement regarding AP capabilities and limitations.
- Autosteer is provided as a Beta feature. This is made clear in the acknowledgement statement above and elsewhere.
- Each time AP is engaged the driver is given a warning to keep their hands on the wheel.
- While engaged, AP will ‘nag’ the user every few minutes to place their hands on the wheel as a reminder that they, and not AP, are still in control of the car. A simple touch is not enough; the driver must provide some bit of resistance to Autosteer. If the driver doesn’t do this, audio in the car is turned off and an alarm sounds. If the driver still doesn’t respond (they’ve fallen asleep, had a heart attack, etc.), the car turns on its emergency blinkers, slows, and attempts to pull off to the side of the road and stop.
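The escalation described above can be sketched as a simple state ladder. This is purely illustrative: the state names and the timing thresholds below are invented for the sketch and are not Tesla’s actual values.

```python
from enum import Enum, auto

class AlertState(Enum):
    NORMAL = auto()        # hands-on resistance detected recently
    VISUAL_NAG = auto()    # periodic "hold the wheel" reminder
    ALARM = auto()         # cabin audio muted, alarm sounding
    SAFE_STOP = auto()     # blinkers on, slowing, pulling over

def escalate(seconds_without_torque: float) -> AlertState:
    """Illustrative escalation ladder only; thresholds are made up."""
    if seconds_without_torque < 120:
        return AlertState.NORMAL
    if seconds_without_torque < 135:
        return AlertState.VISUAL_NAG
    if seconds_without_torque < 160:
        return AlertState.ALARM
    return AlertState.SAFE_STOP
```

The point of the ladder is that each stage only escalates if the driver keeps ignoring the previous one; any wheel input resets it to the top.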
A good driver-assist analogy might be the basic cruise control that most cars have. Like AP, cruise control is not intended to stop the car if there is another car or object in front of it. If someone blamed cruise control for rear-ending the car in front because they neglected to use their brakes, most people would ask why they weren’t paying attention and why they didn’t brake themselves. Tesla AP is just the same.
AP does NOT automatically stop the car if there is an object in front of it. If you are following a car using TACC and that car slows to a stop, your Tesla will come to a safe stop as well. However, if you are using AP or TACC, are not following another car, and approach a junction with stopped cars in front of you, the Tesla will often not apply the brakes in time, nor is it expected to do so; its forward view does not extend far enough. Just because Autosteer is engaged doesn’t mean that TACC gains any additional capabilities.
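The distinction just described can be summarized as a small decision sketch. This is a simplification of the behavior as described above, not Tesla’s actual control logic:

```python
def tacc_will_brake(obstacle_speed_mph: float, tracked_as_lead: bool) -> bool:
    """Simplified sketch of the TACC behavior described in the text,
    NOT Tesla's real implementation.

    - A lead vehicle that TACC has been following is braked for,
      even all the way down to a stop.
    - A stationary obstacle it was never tracking (e.g. stopped
      traffic at a junction) may be ignored: the driver must brake.
    """
    if tracked_as_lead:
        return True   # follows the lead car down to a stop
    if obstacle_speed_mph == 0:
        return False  # stationary and never tracked: driver's job
    return True       # a moving vehicle entering the lane can be acquired
```

The key asymmetry: TACC handles a *tracked* car slowing to zero, but an object that was *already* stationary when it came into range is a different and much harder case.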
Elon Musk statement on ‘Beta’: http://electrek.co/2016/07/11/tesla-autopilot-beta-elon-musk-1-billion-miles-data/
Emergency Braking. AP-equipped Teslas also include an emergency braking capability. It is enabled by default and is not part of Autopilot. It is intended to apply the brakes to “reduce the impact of an unavoidable frontal collision”. Emergency braking is just what it says: for emergencies. It does not replace the driver’s responsibility to pay attention and apply the brakes when necessary; it is a last-resort safety feature to help lessen some impacts when possible.
Two issues are relevant to the Florida crash: emergency braking is disabled above 85 mph, and it uses a narrow (top-to-bottom) field of view, as otherwise it could be triggered by overhead signs. In this latter case the semi trailer was likely above its field of view.
From Tesla’s Blog:
> This is the first known fatality in just over 130 million miles where Autopilot was activated. Among all vehicles in the US, there is a fatality every 94 million miles. Worldwide, there is a fatality approximately every 60 million miles. It is important to emphasize that the NHTSA action is simply a preliminary evaluation to determine whether the system worked according to expectations.
Road fatalities are, sadly, not at all unusual. 411 people were killed by drivers in Minnesota last year and there were 38,300 deaths on U.S. roads in 2015.
Based on the data above there are about 11 fatalities per billion vehicle miles travelled (VMT) in the U.S., while Tesla Autopilot now stands at 7.7 fatalities per billion. Statistically there should have been at least one death of someone using Autopilot before now (closer to two at the worldwide rate). Advantage Tesla, with potentially a life or two saved thanks to Autopilot.
Two caveats, though. First, this is a single data point, and one data point does not make an accurate statistic. On the other hand, the fact that thousands of average people across the world have driven 130 million miles on Autopilot without a death until now does lend considerable credence to arguments that driving with Autopilot is safer than without.
Second, some large fraction (30%?) of those 130 million miles were on much safer European roads, which have a fatality rate of 6.2 per billion VMT. Blending the baselines accordingly brings the expected rate closer to Tesla’s figure, though Tesla still comes out slightly ahead. What we can conclude is that AP at this point appears no more dangerous than driving without it, and is likely safer.
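The arithmetic above is easy to check. The 30% European split is, as noted, a rough guess rather than a known figure:

```python
# Back-of-the-envelope check of the fatality-rate comparison.
BILLION = 1e9

us_rate = BILLION / 94e6      # ~10.6 fatalities per billion VMT (US)
world_rate = BILLION / 60e6   # ~16.7 per billion (worldwide)
tesla_rate = BILLION / 130e6  # ~7.7 per billion (1 fatality in 130M AP miles)

# Expected fatalities over 130M miles at the US baseline rate:
expected_us = 130e6 * us_rate / BILLION   # roughly 1.4

# Blended baseline if ~30% of AP miles were on European roads
# (6.2 per billion VMT); the 30/70 split is only a guess:
blended = 0.7 * us_rate + 0.3 * 6.2       # roughly 9.3 per billion
```

Even against the lower blended baseline of about 9.3, Autopilot’s 7.7 per billion still comes out slightly ahead.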
Next we have to consider if Autopilot was in any way at fault. The NHTSA’s Preliminary Evaluation is a first step in this. It will determine if a full investigation is warranted. Given how new and important this technology is and how advanced Tesla’s capability is, if it were me I would say yes right off the bat simply for the valuable information that might result. We’ll have to wait to see what NHTSA says.
Whatever happens, it will be a while before we know anything very concrete, which should dent the credibility of the journalists and news anchors making ignorant pronouncements without good information.
Some things to consider.
1) The road design, like many in the U.S., may be inherently dangerous. There is a crest in the road approaching the intersection that can make it difficult or impossible for drivers heading east (the Tesla in this case) to see vehicles crossing at NE 140th Ct (such as the semi truck) and vice versa. Some locals have said that even driving the speed limit you have little time to react if a vehicle is crossing there, and they point out that most traffic goes 20+ mph over the speed limit on the road’s very wide, straight, interstate-like lanes.
This may be a good example of U.S. traffic engineers’ theology: that if only everyone obeyed every law and regulation perfectly, we’d have safer roads. Contrast that with engineers elsewhere, such as in the Netherlands, who believe that drivers will not obey every law, and thus that roads must be designed to be safe given our lack of obedience and attention. Per mile driven, about two to three times as many people are killed on roads designed by U.S. traffic engineers as on those designed by Dutch traffic engineers.
2) The driver may have been going excessively fast. There are witness reports that the Tesla passed people who were driving 80-85 mph. If so, and combined with the impaired view mentioned above, then it is likely that even if the driver or AP had seen the truck and reacted immediately, neither could have stopped in time. Over 85 mph would also have disabled emergency braking.
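A quick stopping-distance estimate shows why speed matters so much here. The 1.5-second perception-reaction time and 0.8 g braking deceleration below are generic textbook-style assumptions, not values from the investigation:

```python
def stopping_distance_m(speed_mph: float, reaction_s: float = 1.5,
                        decel_g: float = 0.8) -> float:
    """Total distance to stop: distance covered during the driver's
    reaction time plus braking distance v^2 / (2*a).
    Assumed inputs only; not crash-investigation data."""
    v = speed_mph * 0.44704          # mph -> m/s
    g = 9.81                         # m/s^2
    return v * reaction_s + v**2 / (2 * decel_g * g)
```

At 85-90 mph this works out to roughly 150-165 m of road needed to stop, so a sight line shortened by the crest leaves essentially no margin.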
3) The truck driver may have failed to yield. Stopping or slowing and then getting going again in a large truck is a real PITA; it can be a slow and arduous task involving a lot of shifting. Not surprisingly, truck drivers avoid changes in speed as best they can. Arguably, at an intersection like this it may also be safer for a truck not to stop if the driver can see no oncoming traffic, as this lets them clear the intersection much faster and so present less risk to others. Regardless, there have been reports that the truck driver failed to properly yield. If, at the time he began his left turn, the Tesla was beyond his view, then I’m not sure that this contributed to the crash (there can be a difference between legal and actual here).
The forward camera and other telemetry in the Tesla may help determine this.
4) There are reports that the Tesla driver was watching a movie at the time of the crash. This is based on witnesses saying that they saw a DVD player in the car with a movie playing. Investigators have said that when they arrived there was a DVD player but no movie playing. What is yet to be determined is whether the driver was watching a movie that stopped playing due to the crash.
Pennsylvania and Montana Crashes
In the PA crash the driver of a Tesla Model X hit the cement barrier between lanes of opposing traffic and rolled over. There were no fatalities and injuries were apparently minor. The driver stated to police that Autopilot was engaged.
Tesla has stated that they received an airbag alert but have not yet received telemetry data from the car that would indicate the status of AP and other systems; according to Tesla, this is likely because of damage to the antenna during the crash. When Tesla can manually access the telemetry data in the car (i.e., the car’s black box) we will know more. Until then, any speculation about what role, if any, AP played is nothing more than a wild guess.
The Montana crash appears to be a case of someone using AP in an inappropriate situation — middle of the night on a poorly maintained, narrow road with poor striping, no shoulder, and wood posts and other obstacles directly adjacent to the road. Tesla’s AP uses the painted lines for lane-keeping; absent lines, it will attempt to follow a vehicle in front of it; absent both, it can do nothing.
Currently we don’t have nearly enough information about these to make any judgments. There have been prior incidents in which drivers crashed and blamed Autopilot, only for us to learn later that it was not engaged.
Final note. I have a Tesla Model S and have used every version of Autopilot fairly extensively both for testing/evaluation and as a driver assist feature with my daily driving. It is not perfect but has improved considerably over the past year. It is generally quite reliable (remarkably so) but does occasionally do something unexpected so it is critical that the driver pay close attention at all times and always be ready to take complete control.
The more you use AP the more confident you become in its abilities (I’d guess that with more experience the Motortrend drivers would have touched the wheel even less), and this is where we need to be careful: AP is good enough that it is easy to become over-confident in what it can and will do. Even so, I believe it is much safer than driving without it and, more importantly, is leading to an even safer future.
Even if AP is at some point considered at fault for a crash or fatality, this must be weighed against the crashes and fatalities that it has likely prevented and will prevent in the future.
Two additional features are sometimes considered part of Autopilot: Tesla’s Summon, which allows an owner to, on a very limited basis, move the car at slow speed without being in it, and Autopark, which once engaged will park the car in a parking space.