Federal Government Opens Safety Defect Investigation Into Tesla Autopilot Crashes
NHTSA is examining whether the technology was a contributing factor in multiple crashes with stopped emergency vehicles
The government’s top auto safety watchdog has opened a preliminary investigation into Tesla’s Autopilot driver assistance system after several crashes in which Tesla vehicles struck stopped emergency vehicles. Together, these crashes resulted in 17 injuries and one death. As part of the investigation, the agency is requesting information from Tesla about how Autopilot operates, including how the software has changed over time and how the system is designed to prevent misuse.
According to a letter sent to Tesla by the National Highway Traffic Safety Administration on Aug. 31, 2021, the probe will examine how Tesla’s Autopilot system operates, including how it identifies and reacts to obstacles in the road. But the agency is also evaluating how Autopilot assists and monitors drivers, and how it keeps them engaged with the task of operating the vehicle—a focus that Consumer Reports’ experts and others say is increasingly necessary as automated steering and speed control systems become more prevalent on new vehicles.
Experts, including those at the National Transportation Safety Board, have warned that as systems like Autopilot get better at automating some braking, acceleration, and steering tasks, drivers may come to rely on them too much, even though the systems are designed to work only with an attentive driver behind the wheel. A driver using Autopilot, for example, might stop paying attention to the road, to the point of not noticing a stopped emergency vehicle with its lights flashing.
“It’s not surprising that there’s going to be situations where people are not going to be constantly looking at the road, even though they’re instructed to,” says Kelly Funkhouser, head of connected and automated vehicle testing at CR. “When you have situations that are out of the ordinary—fire trucks, construction signs—that’s when a driver’s attention is needed most to interpret a complex, often dynamic, scenario.”
Federal lawmakers, safety advocates, and other automakers have long called for better safeguards against driver inattention in vehicles that automate some driving tasks. NHTSA’s Special Crash Investigations (SCI) team has also been examining serious crashes involving automation. As of May 2021, the agency said it had launched 34 SCI investigations into crashes in which some form of vehicle automation was in use, 28 of them involving Tesla vehicles.
“NHTSA should get to the bottom of the issue as quickly as it can, and demonstrate it will hold Tesla accountable if the company won’t put people’s safety first on its own,” says William Wallace, manager of safety policy at CR.
The investigation also suggests that NHTSA is paying more attention to crashes involving software and automation, a shift in focus that Wallace says is overdue. In June, the agency began requiring manufacturers to report any crash that results in an injury or property damage while a vehicle is automating some driving tasks.
Correction: This article, originally published Aug. 16, 2021, has been updated to correct the year that Tesla Autopilot was introduced as a feature that drivers could use. Autopilot debuted in 2015, not 2014.