The front of a Tesla Model S.

Update: On Sept. 4, the NTSB found that the design of Tesla's Autopilot, combined with driver inattention, was partly to blame for a Model S crashing into a stationary fire truck last year. No one was injured in the incident.

The story below was originally published on Sept. 3.

A Tesla Model S driver who hit a fire truck on a California highway in January 2018 was using Autopilot and rarely touched his steering wheel during the final 13 minutes and 48 seconds before the collision, according to new investigation documents.

The driver, Robin Geoulla, appeared to be looking down and not at the road at impact, according to an eyewitness who saw the Model S pass him on the Culver City highway. The National Transportation Safety Board, which has been investigating the crash, made public on its website Tuesday statements from the driver and the witness, along with technical findings from crash investigators.

Phone records indicate Geoulla wasn’t using his phone to make a call or text before the crash, the NTSB said. Investigators didn’t rule out the possibility that he was looking at his phone for some other reason. Geoulla told police at the crash scene that he had been looking at his radio. He told the NTSB he saw the fire truck for the first time after the crash. 

As systems such as Autopilot become better at steering and braking, it’s extremely important that consumers are aware of the systems’ limitations, according to Kelly Funkhouser, head of connected and automated vehicle testing at Consumer Reports. It’s human nature to become complacent after technology works well numerous times, Funkhouser says.

“This is not about blaming the driver or requiring driver training,” Funkhouser says. “This is about designing a system that safeguards drivers and keeps them engaged.”

Advanced driver assistance systems, such as Autopilot, aren’t the same as self-driving cars—meaning the human driver is still responsible for paying attention to the road. Cadillac, Infiniti, Mercedes-Benz, Nissan, and Volvo offer systems similar to Autopilot, under various names. These systems can maintain a vehicle’s place in the flow of traffic and keep it within the lines of its lane well enough to lull drivers into complacency. Only Cadillac’s Super Cruise has a driver-facing camera that will issue warnings if the driver stops looking at the road.

As in some other high-profile Tesla crashes, the Model S was in Autopilot mode as it followed another vehicle in traffic. When the leading vehicle changed lanes, Geoulla said, his vehicle didn’t slow down, and he didn’t see the fire truck until he hit it.

The NTSB’s report said the Model S hit the fire truck at a speed between 4 and 24 mph. Geoulla told police and investigators he thought he had been going 65 mph and that Autopilot had been set at 65 mph. Postcrash photos show the car’s front end smashed and the airbags deployed. Neither Geoulla nor the occupants in the fire truck were injured. 

Geoulla touched the steering wheel for a total of only 51 seconds during the final 13 minutes and 48 seconds before the crash, the NTSB report says. Geoulla told the NTSB that he bought his Model S because he wanted Autopilot. He added that he knew Autopilot wasn’t able to fully drive his car.

Autopilot in Crashes

Geoulla said that he used Autopilot from the first days he owned his Model S and that he quickly learned the system’s flaws, according to a transcript of an interview he gave to NTSB investigators. He said that Autopilot was unreliable on roads with sharp turns and could become confused when a car directly in front left the lane. Autopilot also didn’t work well when driving directly into the sun, he told the NTSB.

“I think it was named wrong,” Geoulla told NTSB investigators. “It’s a very good cruise control with a little bit more advanced technology in it.”

Autopilot has been engaged in at least three fatal crashes—all of which have been investigated by the NTSB. The safety board investigates a limited number of highway crashes each year, and it has taken a special interest in incidents involving new driver assist technology. In its probe into a fatal May 2016 Florida crash, the NTSB faulted Tesla for not doing more to program Autopilot so that it could be used only on highways with exit ramps. The safety board also said the auto industry more broadly needed to do more with technology to monitor whether drivers were paying attention to the road. 

Several Tesla crashes follow a common scenario. A Tesla vehicle operating on Autopilot is following another vehicle. The lead vehicle suddenly leaves the lane to avoid something ahead that’s stationary or moving slowly. The Tesla’s driver assist systems don’t have time to react to the object suddenly in the car’s path, such as a stopped fire truck, and there’s a collision.

A Tesla spokeswoman, Danielle Meister, didn’t immediately respond to an email seeking comment on the NTSB documents. 

Tesla has said that limitations of Autopilot, adaptive cruise control, and automatic emergency braking are detailed in its owner’s manuals. Adaptive cruise control cannot brake or decelerate in response to stationary vehicles, especially when traveling faster than 50 mph. AEB is designed to reduce the severity of an impact, not to avoid a collision, the company says, and Autosteer won’t steer a vehicle around objects that jut into the driving lane.

After a fatal crash in March 2018, Tesla said that Autopilot doesn’t prevent all accidents and that such a standard would be impossible. Autopilot makes crashes less likely and “unequivocally makes the world safer for the vehicle occupants, pedestrians, and cyclists,” the company said.