Safety Agency Expands Tesla Autopilot Investigation

The scope of the NHTSA investigation covers about 830,000 electric cars that might have an increased crash risk

Photo: A 2021 Tesla Model S with the driver's hands off the yoke. John Powers/Consumer Reports

The National Highway Traffic Safety Administration has upgraded its investigation into Tesla’s Autopilot active driver assistance system from a preliminary evaluation to an engineering analysis. This means the safety agency will extend its crash analysis of incidents involving Autopilot by looking at additional data and performing vehicle evaluations. Plus, NHTSA will explore how Autopilot’s design might increase the risk of a crash that happens after a driver stops paying attention to the road.

NHTSA’s look into Autopilot began in August 2021, after a growing number of crashes with Autopilot engaged were reported, including collisions with parked and emergency vehicles. The scope of the ongoing investigation covers about 830,000 Teslas from the 2014 to 2022 model years, including the four current vehicles: Model 3, Model Y, Model S, and Model X.

In August and September, NHTSA sent letters to Tesla and 12 other automakers requesting data on crashes and information about their active driving assistance systems. 

The safety agency found 16 crashes in which a Tesla struck first-responder or road maintenance vehicles. In many of these incidents, the forward collision warning and/or automatic emergency braking systems intervened, but on average Autopilot aborted vehicle control less than 1 second prior to impact. Of those crashes, NHTSA found that driver attention warnings were issued in just two cases.

In addition, NHTSA reviewed 191 other crashes involving Autopilot. 

Experts, including those at the National Transportation Safety Board, have warned that as systems like Autopilot get better at automating some braking, acceleration, and steering tasks, drivers may come to rely on them too much, even though these systems are designed to work only with an attentive driver behind the wheel. A driver using Autopilot, for example, might stop paying attention to the task of driving—even to the point of failing to notice a stopped emergency vehicle with its lights flashing.

Federal lawmakers, safety advocates, and other automakers have long called for better safeguards against driver inattention in vehicles that automate some driving tasks. 

“NHTSA is right to step up its safety investigation into Tesla’s driver-assist systems,” says William Wallace, associate director of safety policy at Consumer Reports. “If it’s foreseeable that some part of a car’s design increases the risk of a crash, then the manufacturer needs to fix the car right away. The company can rework how a feature operates, or add safeguards like an effective driver monitoring system, or do some combination of both—but it must take action. And if the company doesn’t address the problem voluntarily, NHTSA should force them to do so.”

Read: “Tesla’s Camera-Based Driver Monitoring Fails to Keep Driver Attention on the Road, CR Tests Show.”

A Timeline of Tesla's Self-Driving Aspirations

From the first mandatory federal auto safety regulations to Elon Musk’s wildest dreams, here’s the history of how Tesla got to where it is today.

Jeff S. Bartlett

A New England native, I have piloted a wide variety of vehicles, from a Segway to an aircraft carrier. All told, I have driven thousands of vehicles—many on race tracks across the globe. Today, that experience and passion are harnessed at the CR Auto Test Center to empower consumers. And if some tires must be sacrificed in the pursuit of truth, so be it. Follow me on Twitter (@JeffSBartlett).