Federal Government Opens Safety Defect Investigation Into Tesla Autopilot Crashes

NHTSA is looking at whether the technology may be a contributing factor in multiple crashes with emergency vehicles


The government’s top auto safety watchdog has opened a preliminary investigation into Tesla’s Autopilot driver assistance system after a series of crashes in which Tesla vehicles struck stopped emergency vehicles. Together, the crashes have resulted in 17 injuries and one death. As part of the investigation, the agency is requesting information from Tesla about how Autopilot operates, including how the software has changed over time and how the system is designed to prevent misuse.

According to a letter sent to Tesla by the National Highway Traffic Safety Administration on Aug. 31, 2021, the probe will examine how Tesla’s Autopilot system operates, including how it identifies and reacts to obstacles in the road. But the agency is also evaluating how Autopilot assists and monitors drivers, and how it keeps them engaged with the task of operating the vehicle—a focus that Consumer Reports’ experts and others say is increasingly necessary as automated steering and speed control systems become more prevalent on new vehicles.


“These types of systems are not yet capable of detecting every kind of unexpected situation and preventing a crash,” says Jake Fisher, senior director of auto testing at Consumer Reports. “So for the foreseeable future, the systems will need to make sure the driver is engaged.”

Since Autopilot’s debut in 2015, regulators and safety advocates have been scrutinizing crashes involving the technology—including a March 2018 crash that killed the driver of a Tesla Model X in California—not only because the feature automates some driving tasks but also because it may encourage drivers to pay less attention even while they are still fully responsible for the car’s actions.

The latest investigation focuses on 11 crashes across the country in which Autopilot or Traffic Aware Cruise Control was confirmed to be active just before the collision. Most of the crashes took place after dark, at crash or construction scenes marked by flashing emergency lights or other traffic control measures, with vehicles blocking the road or parked beside it.

The letter also makes a number of specific requests, including any information the automaker has about other Autopilot-involved crashes, complaints, lawsuits, or reports of property damage. Tesla will also have to share how Autopilot has been marketed, detailed explanations of changes to the Autopilot software over time, and examples of how it validates and tests that software. Tesla has until Oct. 22, 2021, to furnish these documents, request an extension, or explain why the documents cannot be shared because of confidentiality concerns; otherwise the automaker could face civil penalties of up to $114,954,525.

The investigation covers 765,000 Model 3, Model S, Model X, and Model Y electric vehicles from the 2014 through 2021 model years. A defect investigation can lead to a product recall; automakers also can be fined if NHTSA finds that a company failed to report a safety defect in a timely manner.

CR emailed Tesla for comment, but the automaker did not respond.

Although this investigation involves only Tesla models, the problem of ensuring that drivers are paying attention as vehicles increasingly automate some driving tasks is not unique to Tesla, Fisher says. “Tesla vehicles have had Autopilot for several years, but most other automakers are now offering similar systems as well,” he says.

A 2018 crash in Utah. The driver of the Tesla Model S said she was using Autopilot when the car hit the back of a fire truck.

Experts, including those at the National Transportation Safety Board, have warned that as systems like Autopilot get better at automating braking, acceleration, and steering, drivers may start to rely on them too much, even though the systems are designed to be used only with an attentive driver behind the wheel. A driver using Autopilot, for example, might stop paying attention to the road entirely, to the point of not noticing a stopped emergency vehicle with its lights flashing.

“It’s not surprising that there’s going to be situations where people are not going to be constantly looking at the road, even though they’re instructed to,” says Kelly Funkhouser, head of connected and automated vehicle testing at CR. “When you have situations that are out of the ordinary—fire trucks, construction signs—that’s when a driver’s attention is needed most to interpret a complex, often dynamic, scenario.”

Federal lawmakers, safety advocates, and other automakers have long called for better safeguards against driver inattention in vehicles that automate some driving tasks. NHTSA’s Special Crash Investigations team also has been examining serious crashes involving automation. As of May 2021, the agency said, it had launched 34 SCI investigations into crashes where some form of vehicle automation was in use; 28 of them involved Tesla vehicles.

“NHTSA should get to the bottom of the issue as quickly as it can, and demonstrate it will hold Tesla accountable if the company won’t put people’s safety first on its own,” says William Wallace, manager of safety policy at CR.

The investigation also suggests that NHTSA is paying more attention to crashes involving software and automation, a change in focus that Wallace says is overdue. In June, the agency began ordering manufacturers to report any crash that occurs while a vehicle is automating some driving tasks and that results in an injury or reported property damage.

Correction: This article, originally published Aug. 16, 2021, has been updated to correct the year that Tesla Autopilot was introduced as a feature that drivers could use. Autopilot debuted in 2015, not 2014.



Keith Barry

Despite my love for quirky, old European sedans like the Renault Medallion, it's my passion to help others find a safe, reliable car that still puts a smile on their face—even if they're stuck in traffic. When I'm not behind the wheel or the keyboard, you can find me exploring a new city on foot or planning my next trip.