Cadillac's Super Cruise, with the driver's hands off the wheel. Researchers worry there may be more crashes related to automated driving systems.

The front half of the Tesla Model S sat destroyed, fully wedged under the back of a fire truck on a major highway just outside Salt Lake City.

The driver told local crash investigators she'd been using the Autopilot driver-assist system on her Tesla. It wasn't until later that they learned from the automaker that her hands had been off the wheel for 80 seconds before impact.

This was the first time firefighters with the Unified Fire Authority responded to a car crash related to partially automated driving systems, according to assistant chief Mike Watson. His department has seen lots of crashes caused by distracted or impaired drivers, but this was completely new territory.

“You can get your arms around, ‘Oh, the driver was distracted.’ You can get your arms around, ‘Oh, the driver was impaired,’” he told CR. “We don’t have our arms around this technology.”

Crashes like this one may well be the canary in the coal mine as more vehicles with partially automated steering and speed control arrive at dealerships at increasingly affordable prices, introducing new risks even as these features promise to make driving easier and more convenient.


The systems take over some—but not all—driving tasks from humans. These convenience features include self-steering technology and adaptive cruise control—which lets your car keep pace with the car ahead.

CR's experts and several others interviewed for this report predict more crashes in the next few years like the one in South Jordan, Utah, in which drivers appear to rely on the new technology to drive for them.

"These systems don't make the vehicle self-driving by any means,” says Jake Fisher, director of auto testing at Consumer Reports. “They require an attentive driver who’s able to take over the controls at a moment's notice. This need for a handoff may be a recipe for disaster."

There appear to be no official predictions from government or academic sources about the number of future crashes expected from this new brand of distracted driving, or whether those systems may reduce other kinds of crashes. But at least four Tesla cars have crashed into stopped emergency vehicles over the last year. In some of the cases, the drivers reported that they were using Autopilot.

In response to CR's questions to Tesla about how the company ensures drivers are prepared to take over when Autopilot disengages, a Tesla spokesperson told CR that Autopilot operates in conjunction with the vehicle’s driver and requires direct supervision at all times.

Some of these convenience features are packaged together by automakers and given names, such as Tesla's Autopilot or Nissan/Infiniti’s ProPilot Assist. Other automakers offer them as individual features available with various option packages.

Automated steering and acceleration are currently available in luxury vehicles from BMW, Cadillac, Mercedes-Benz, Volvo, and others. But mainstream vehicles are getting them, too: The new 2019 Nissan Rogue and Altima offer automated steering and speed control, and General Motors has promised more automation in less expensive vehicles by 2020.

From Runways to Roadways

This new type of distracted driving has nothing to do with the technological strides made in advanced automotive safety systems, such as forward collision warning (FCW), automatic emergency braking (AEB), or blind-spot warning (BSW). Research shows that those features reduce crashes on the nation's roads.

By contrast, partially automated driving systems haven't been shown to either increase or decrease safety—but decades of research suggest a potentially dangerous relationship unfolding between old and new driving technology, and between old and new driver behavior.

In aviation, the problem is called mode confusion—and it also applies to cars. “Operators, whether they’re pilots or they’re drivers, do not understand the mode that the automation is in, and what its limitations are," explains Missy Cummings, a former fighter pilot who now directs the Humans and Autonomy Lab at Duke University in Durham, N.C.

Cummings says she's concerned about how drivers will react to the systems. "The more we put in these ambiguous driver-assist technologies that potentially can drive the car in some limited capacity, we will see more of those accidents, absolutely."

CR believes industry and policymakers need to do more to prepare for what could be a deluge of drivers on the road who don't fully understand the limitations of these new systems.

David Friedman, vice president of advocacy for Consumer Reports, has dubbed the problem “automation-induced distraction.”

“With or without these technologies, drivers need to pay attention, but automakers have a responsibility to put safety first, too,” Friedman says. “By taking a few straightforward steps—like warning drivers that they have tuned out, ensuring driving automation is available only when it is safe, and improving emergency fallback systems—car companies can reduce the risk of automation-induced driver distraction.”

Some of those suggestions have already shown up in new vehicles. For instance, Cadillac’s Super Cruise uses eye-tracking technology to ensure that the driver's eyes are on the road when the system is turned on and limits the system’s use to limited-access highways—approaches that have won praise from CR’s testers. A spokesman for Cadillac declined to comment for this story.

For Cummings, even driver monitoring doesn't go far enough to make the technology safe. "We should either fully automate or we should leave people substantially in the loop," she says.

The 2019 Nissan Altima, which offers self-steering and semi-autonomous acceleration functions.

There Are No Self-Driving Cars

According to a recent AAA survey, drivers who already own cars with these systems say they don't fully understand how the features work. And 29 percent of respondents said they felt "comfortable engaging in other activities while driving" when adaptive cruise control was turned on.

Systems like Tesla’s Autopilot aren’t designed to handle every situation, including stopping for a fire truck with its lights flashing. They're also not meant to be used on the kind of busy city streets where some Autopilot-related crashes have taken place.

"The problem is that these drivers are assuming that their cars are in control of the situation—but they’re not,” says Kelly Funkhouser, program manager for vehicle usability and automation at Consumer Reports.

Funkhouser says human nature is more to blame than technological limitations. “When humans are asked to monitor automation, they lose interest," she says. "That leads to slower reaction times, and can cause crashes.”

In research she conducted when she was at the University of Utah, Funkhouser found that driver reaction times lengthened by almost 50 percent when partial automation systems were used. The longer the features were turned on, the worse the driver's reaction time, her research found. Drivers who used a phone while driving were worse off with the automation systems turned on.

The 2016 Tesla Model S that crashed into a fire truck in South Jordan, Utah, while Autopilot was engaged.

The growing pains of automation are familiar to NASA research psychologist Steve Casner. "We did this 30, 40 years ago,” he says.

That’s when aircraft manufacturers started using automation to reduce the most common crashes. For the most part, it worked. “What we didn’t anticipate,” says Casner, was that “some new problems arose from the way people interact with the automation.”

Pilots made new—and different—errors because they started relying on the new technology, which led to new kinds of crashes that the industry had never seen before, Casner says.

For example, in August 1987, the pilots of Northwest Airlines Flight 255 out of the Detroit area got distracted while performing a preflight checklist and forgot to properly configure their McDonnell Douglas MD-80 aircraft for takeoff.

Normally, an automated system would have warned the pilots of their error, but an undiscovered electrical fault kept a vital alarm from sounding. The plane failed to gain altitude and crashed, killing 156 people.

“Instead of looking out for problems, you now just lend an ear to listening to the warning system to see if it goes off,” he said. Pilots had to be specially trained to fly with automated systems, and the systems themselves kept evolving—which ultimately made the skies much safer, he said.

The same thing could happen on the highways someday, Casner says. “Automation with a driver who intelligently uses that automation, that’s a powerful combination, that’s going to save lives.”

But there are a lot more drivers on the road than commercial pilots in the air, and making partial automation safer isn't just about additional training. Rather, it's an issue that car companies must address, CR's Fisher says.

“Most of today’s systems don’t do enough to make sure drivers are paying enough attention to their situation,” he says.

Additionally, many drivers don’t understand the limitations of driver-assist technology, even though it's now becoming more mainstream and affordable than ever before.

A Nissan with ProPilot Assist.

What Consumers Can Do

Even today, informed drivers can still protect themselves, Funkhouser says. Before they get behind the wheel, car shoppers need to understand more about partial automation, including where it can be safely used, how it’s supposed to work, and what the car does and doesn’t do for them. That knowledge will become even more important as many drivers encounter the technology for the first time.

Even having just a general understanding of how these systems work could help drivers avoid a crash, NASA's Casner told CR.

"Unless you really know something about how that automation works and know all its limitations, you’re going to be unprepared," he said. "The computer is not nearly as good as you are at detecting these bizarre situations."

Drivers become more aware of partial automation’s limitations after they’ve spent time behind the wheel, says David Kidd, senior research scientist with the Highway Loss Data Institute. To get a sense of how these systems operate, Kidd recommends that shoppers experience the technology while test driving several cars, in addition to reading the owner’s manual. “The way they’re implemented across vehicles is very different,” he says.

And shoppers need to be aware of these systems even if they aren’t looking to use them, Fisher says, because the features are often bundled into option packages with advanced safety equipment they do want, such as AEB and FCW. “Take time reading the manual and understanding the settings,” he says. “Enable the safety features, but for now you may be safer steering for yourself.”

Watson, the assistant fire chief in Utah, said that although he’s optimistic about the future of automation in cars, he’d give similar advice to his own family.

“If my son had a car that had that technology, I would still have a conversation to say, ‘That doesn’t mean you get to check out, that doesn’t mean you don’t have to pay attention, it means this is supposed to help you while you drive.’”

Editor's Note: After publication, Consumer Reports was contacted by Tesla, and the story was updated to reflect the company's response to our questions.