Tesla: Should You Pay For My Car Insurance?
By Melinda Leung | June 17, 2022

Twitter lede: NHTSA published a report finding that Tesla vehicles were involved in nearly 75% of reported crashes involving advanced driver-assistance systems. But before we blame Tesla, we need to better understand the data and context behind those numbers.

The National Highway Traffic Safety Administration (NHTSA) recently published its first-ever report on crashes involving vehicles using driver-assist technologies. It found 367 crashes in the last nine months involving vehicles that were using these advanced driver assistance systems. Almost 75% of those incidents involved a Tesla running the company's iconic Autopilot, with three of the crashes leading to injuries and five to deaths.

Before we jump the gun, blame Tesla for these issues, and forever veto all self-driving vehicles, we need to take a step back and understand the context behind the numbers.

Levels of Autonomy

First, how do autonomous vehicles even work? There are actually five levels of autonomy.

  • Level 0: We’re still driving. Hence, this is not even considered a level in the autonomy scale.
  • Level 1: The vehicle can assist with either steering or speed (braking and accelerating), but not both at once.
  • Level 2: The vehicle can control both steering and speed at the same time (think adaptive cruise control plus lane keeping). However, the human driver still needs to monitor the environment at all times. This is the level Tesla's Autopilot officially sits at today.
  • Level 3: The vehicle can perform most driving tasks, but the human driver needs to be ready to take control at any moment and essentially remains the fallback driver behind the wheel.
  • Level 4: The vehicle can perform all driving tasks and monitor the driving environment in most conditions. The human doesn’t have to pay attention during those times.
  • Level 5: Forget steering wheels. At this point, the vehicle completely drives for you and the human occupants are just passengers. They are not involved in the driving whatsoever.

There are 5 levels of autonomous vehicles. (Source: Lemberg Law)

Now that we understand the levels of autonomy, we know that there is still a human component to Tesla's Autopilot feature. Full self-driving isn't fully here yet, so accidents involving driver-assist technologies still very much involve human drivers.
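To make the scale concrete, here is a minimal sketch in Python of how you might encode these levels and check whether the human still has to watch the road. The enum and helper names are invented for illustration; they are not part of any official SAE or NHTSA tooling.

    from enum import IntEnum

    class AutonomyLevel(IntEnum):
        NO_AUTOMATION = 0           # human does all the driving
        DRIVER_ASSISTANCE = 1       # assists with steering OR speed, not both
        PARTIAL_AUTOMATION = 2      # controls steering AND speed; human must monitor (Autopilot today)
        CONDITIONAL_AUTOMATION = 3  # drives itself, but human must be ready to take over
        HIGH_AUTOMATION = 4         # no human attention needed in most conditions
        FULL_AUTOMATION = 5         # humans are just passengers

    def human_must_monitor(level: AutonomyLevel) -> bool:
        """At Levels 0-2, the human is still responsible for watching the road."""
        return level <= AutonomyLevel.PARTIAL_AUTOMATION

    print(human_must_monitor(AutonomyLevel.PARTIAL_AUTOMATION))  # True: Autopilot crashes still involve a human
    print(human_must_monitor(AutonomyLevel.HIGH_AUTOMATION))     # False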

So Can We Blame Tesla Yet?

Not quite yet. Because Tesla is the brand most associated with autonomous driving, it is no surprise that it also sells the largest number of vehicles equipped with the most advanced driver assistance technologies. Being the biggest and best-known player in the industry, Tesla will naturally account for the largest raw count of crashes. Judging safety from those raw counts alone is a classic base rate fallacy; what is more useful is the rate of crashes per mile driven, ideally per mile driven with the system engaged.
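To see why exposure matters, here is a small sketch using made-up numbers (these are not the real NHTSA or fleet figures): a big fleet can rack up more total crashes while still being safer per mile than a small one.

    def crashes_per_million_miles(crashes: int, miles_driven: float) -> float:
        """Normalize a raw crash count by exposure (miles driven)."""
        return crashes / (miles_driven / 1_000_000)

    # Hypothetical fleets, purely for illustration.
    fleets = {
        "Automaker A (large fleet)": {"crashes": 270, "miles": 5_000_000_000},
        "Automaker B (small fleet)": {"crashes": 30, "miles": 200_000_000},
    }

    for name, stats in fleets.items():
        rate = crashes_per_million_miles(stats["crashes"], stats["miles"])
        print(f"{name}: {stats['crashes']} crashes, {rate:.3f} per million miles")

    # Automaker A (large fleet): 270 crashes, 0.054 per million miles
    # Automaker B (small fleet): 30 crashes, 0.150 per million miles

More crashes in absolute terms, but fewer per mile driven: that is the comparison the raw NHTSA counts cannot give us yet.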

Unlike most automakers, Tesla also knows exactly which vehicles were using Autopilot at the time of a crash. Its vehicles are equipped with cellular connectivity that automatically reports this information back to Tesla when a crash occurs. Not every automaker's vehicles do this, so Tesla's systems may simply be better at relaying crash information than others, inflating its share of the reports.

Next, what if the crash would have happened regardless of whether the vehicle was on Autopilot? For example, if the car behind you was driving too quickly and rear-ended you, it didn't matter who was driving: you would have been hit no matter what. Because the data gives us little context about the type of accident, it's difficult to understand who was actually at fault.

Lastly, under NHTSA's reporting order, companies must report a crash if any automated driving technology was in use within 30 seconds of impact. According to Waymo, the autonomous driving company under Google's parent Alphabet, about a third of its reported crashes happened while the vehicle was in manual mode but still fell inside that 30-second window. Waymo is one of the oldest players in this industry, and if Tesla's crashes break down similarly, it becomes very hard to pin 100% of the blame on Autopilot.
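To illustrate how wide that net is, here is a simplified sketch of the 30-second rule; the record format and field names are invented for this example and are far simpler than the actual Standing General Order reports.

    from dataclasses import dataclass
    from typing import Optional

    REPORTING_WINDOW_SECONDS = 30

    @dataclass
    class CrashRecord:
        impact_time: float                            # seconds on some common clock
        adas_disengaged_time: Optional[float] = None  # when driver-assist was switched off; None if still on

    def is_reportable(crash: CrashRecord) -> bool:
        """Reportable if driver assistance was on at impact or within the prior 30 seconds."""
        if crash.adas_disengaged_time is None:
            return True  # system still engaged at the moment of impact
        return crash.impact_time - crash.adas_disengaged_time <= REPORTING_WINDOW_SECONDS

    # Driver took over 12 seconds before impact: still counted as a driver-assist crash.
    print(is_reportable(CrashRecord(impact_time=100.0, adas_disengaged_time=88.0)))   # True
    # Driver had been in manual mode for two minutes: not counted.
    print(is_reportable(CrashRecord(impact_time=100.0, adas_disengaged_time=-20.0)))  # False

A crash that happens well after the driver takes back control lands in the same bucket as one where the system was steering at impact, which is exactly why the Waymo-style breakdown matters.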

If two cars on Autopilot crash into each other and that turns out to be a common occurrence, then yes, let's make Elon Musk pay for our increased car insurance premiums. But until we have a lot more data about the conditions of these crashes, it's hard to determine who is really at fault or to make sweeping assumptions about the safety of these vehicles.

References

  1. National Highway Traffic Safety Administration. (2022, June). Summary Report: Standing General Order on Crash Reporting for Level 2 Advanced Driver Assistance Systems. https://www.nhtsa.gov/sites/nhtsa.gov/files/2022-06/ADAS-L2-SGO-Report-June-2022.pdf
  2. Lemberg Law. What You Need to Know about Driverless Cars. https://lemberglaw.com/are-driverless-cars-safe/
  3. McFarland, Matt. (2021, November 3). Tesla owners say they are wowed — and alarmed — by ‘full self-driving’. CNN. https://www.cnn.com/2021/11/03/cars/tesla-full-self-driving-fsd/index.html
  4. Hawkins, Andrew. (2022, June 15). US releases new driver-assist crash data & surprise, it’s mostly Tesla. The Verge. https://www.theverge.com/2022/6/15/23168088/nhtsa-adas-self-driving-crash-data-tesla