Tesla Model 3 using Autopilot

The Insurance Institute for Highway Safety has issued a set of research-based guidelines for monitoring whether drivers are paying attention to the road when using high-tech driver assistance systems—adding pressure on the auto industry to address what has become a growing safety concern.

Cars with systems that steer, accelerate, and brake under certain conditions, such as Tesla’s Autopilot or Nissan's ProPilot Assist, should have devices to track a driver’s eye movements and/or head position, IIHS says.

The research group is also recommending that automakers use increasingly urgent signals—visual, audible, and tactile alerts—to get a driver’s attention when needed.

General Motors has developed a system for Cadillac called Super Cruise that allows hands-free driving while the car steers itself and maintains a safe distance in traffic, but only if the driver is paying attention. It's the only system that monitors eye movements, using cameras pointed at the driver to determine whether the driver's eyes are open and looking forward toward the road.


The IIHS says its recommendations are backed by new research that shows what would work most effectively in cars on the road today. IIHS points out that its surveys show a persistent misconception among consumers that cars with some driver assistance systems are practically autonomous.

In reality, there's no such thing as a fully autonomous vehicle that consumers can buy, and there are plenty of ordinary circumstances on the road in which automated steering, acceleration, or braking doesn't work very well, or at all.

This inconsistency has been underscored by some high-profile crashes involving Tesla vehicles (with Autopilot engaged) striking parked emergency vehicles along the roadside or other objects. All automakers offering driver assistance systems emphasize that it’s the driver’s duty to pay attention to the task of driving and to take over if needed. It’s clear that not all drivers are heeding this advice.

"These systems are amazing feats of engineering," says Alexandra Mueller, a research scientist at IIHS who worked on the recommendations. "But they all suffer from the same problem. They don’t account enough for the behavior of the human being behind the wheel."

Systems like Autopilot and Super Cruise—and others by Volvo and Nissan—use cameras, radar, and other sensors to keep a car centered in its lane and control speed so that it remains a set distance from other cars in traffic. They sometimes use high-definition maps to pinpoint a moving car’s location and anticipate upcoming obstacles. They have different ways of gauging whether the driver is paying attention. 

Automated driving functions in the Tesla, Cadillac, Volvo, and Nissan systems use technologies that have been broadly called advanced driver assistance systems, or ADAS. But automatic steering, braking, and acceleration that take control of the driving task under certain circumstances are only a subset of ADAS, which also includes features like parking assist, blind-spot warnings, and automatic emergency braking. 

These kinds of driver assistance systems shouldn’t be allowed to change lanes or pass other vehicles without input from the human driver, IIHS says. Systems from BMW, Mercedes-Benz, and Tesla can do this already, and Cadillac says it will add that feature in 2021, according to IIHS.

Another recommendation from IIHS is that driver assistance systems shouldn’t be allowed to block the human driver’s input when a feature, such as automatic lane centering, is engaged. Autopilot discourages active driver participation by canceling lane-centering if there’s even a minor steering adjustment, IIHS says.

CR's Testing

CR's ratings of driver assistance systems found that Cadillac’s Super Cruise did the best job of monitoring the driver and communicating when the car is in control and when the driver needs to take over. Super Cruise also locks out the feature on roads where it won’t work properly—in Cadillac’s case, that means it works only on interstates and other limited-access highways. Super Cruise clearly indicates through colored lights on the steering wheel when the car is able to control key driving functions and when the driver needs to take control. 

None of the three other automated driving systems CR rated have eye tracking. BMW and Subaru have since come out with driver-facing cameras, but they’re not used for the same kind of automated driving system. 

As cars become more skilled at basic driving functions, drivers tend to become less engaged, says Jake Fisher, senior director of auto testing at Consumer Reports. Drivers become overconfident that the car is in control and can’t react quickly enough in a split-second emergency, he says. 

“The IIHS recommendations are the latest evidence that it’s essential to make sure the driver is looking at the road when using these systems,” Fisher says. “Unfortunately, the rollout of systems like Super Cruise that monitor the driver has been very slow.” 

Consumer Reports has called on any manufacturer offering active driver assistance technologies to include driver monitoring, and for the National Highway Traffic Safety Administration to make sure that happens by setting strong safety rules for these systems.

CR's experts also are urging NHTSA to make much greater use of its authority to require recalls of unsafe driver-assistance systems.

"When a car's driver-assistance system fails to guard against foreseeable safety risks—such as by not having an effective driver monitoring system—NHTSA should consider it defective," says William Wallace, CR's manager of safety policy. 

On Feb. 25, the National Transportation Safety Board called out Tesla and NHTSA on the issue of driver monitoring. Like many first-generation automated driving systems, Tesla's relies only on a torque sensor in the steering wheel to tell whether a driver is holding on to the wheel. 

That kind of system is easily fooled, the NTSB said, and the design was cited as a factor in a fatal 2018 Mountain View, Calif., crash. Autopilot steered the Model X into a highway barrier. The system was controlling the SUV even though the driver had been playing a video game on his mobile phone in the moments leading up to the crash. 

The safety board reiterated a recommendation that Tesla develop better ways to sense a driver’s attention and stronger alerts for drivers when they need to take over. It also said Tesla should put more limits on Autopilot, such as locking it out on roads where it’s not designed to be used—like local streets or those with cross traffic. 

NHTSA should use its investigative authority to determine if the fact that drivers are using Autopilot in unintended ways presents an unreasonable safety risk, the NTSB said. The regulator should force Tesla to recall Autopilot if necessary, the safety board said. It also said NHTSA should work with SAE International, the auto industry’s standard-setting organization, to develop standards for driver monitoring. 

Decades of research in all modes of transportation shows that humans come to trust automated controls very quickly and stop paying attention, leading to crashes, the NTSB said. 

Mercedes-Benz GLE with ADAS systems