Tesla’s Camera-Based Driver Monitoring Fails to Keep Driver Attention on the Road, CR Tests Show
GM's Super Cruise system does a far better job at keeping drivers engaged
Update: After this article was first published, on Dec. 22, 2021, Tesla sent software updates to our vehicles. Our Tesla Model Y received software version v11.0 (2021.44.25.2); our Model S, version FSD beta 10.8 (2021.44.25.6). We downloaded and installed these updates and reevaluated the vehicles. We found no difference in how their driver monitoring systems performed from when we tested software versions v10.2 (2021.32.22) on the Model Y and FSD beta 10.4 (2021.36.8.6) on the Model S in late November and early December.
In both cars, we were still able to engage Autopilot and Full Self-Driving beta with their cabin cameras fully covered. We will continue to evaluate the latest versions of software that our tested cars receive from Tesla and other manufacturers.
Original article: In Consumer Reports’ tests, a safety feature designed to ensure that Tesla drivers keep their eyes on the road while Autopilot automates some braking, acceleration, and steering tasks performed poorly.
Earlier this year, Tesla announced that it had activated in-car cameras built into some of its cars—which the automaker calls “cabin cameras”—so they could detect driver inattentiveness when Autopilot is in use, then alert drivers when they need to pay more attention. Tesla’s announcement followed calls from Consumer Reports and others for all automakers to install effective driver monitoring systems on any vehicle that automates certain driving functions, and to use monitoring technology to prevent drivers from using those automation features if they appear to not be looking at the road. Most of these systems use eye or head tracking to determine where a driver is looking.
When CR tested Tesla’s cabin cameras in our Model S and Model Y, we found:
• Drivers could still use Autopilot while looking away from the road or using their phone.
• Even when the vehicle’s camera was obscured, Autopilot remained active and the driver could continue using the system.
• We could use Tesla’s Full Self-Driving (FSD) beta software with the vehicle’s camera blocked.
“It is proven that drivers pay less attention to the road when a vehicle is automating some driving tasks, and therefore they may have trouble reacting in time in an emergency if they need to take back control,” says Kelly Funkhouser, manager for vehicle technology at CR.
Consumer Reports’ tests of our Model Y and Model S did find that the amount of time the driver could have their hands off the wheel was shortened if the camera detected that the driver’s eyes were off the road. But as long as the driver’s hands remained on the wheel, as Autopilot instructs, we saw no difference in warnings whether eyes were on or off the road.
CR previously tested other driver monitoring cameras in vehicles from BMW and Subaru. While the BMW camera is used in conjunction with its Traffic Jam Assist feature at speeds below 40 mph to allow hands-free driving, it isn’t used at higher speeds, even when adaptive cruise control and lane keeping assistance are both engaged. A BMW spokesperson told CR that the system is designed this way to reinforce that completely hands-free driving isn’t available at speeds above 40 mph.
The Subaru camera is able to detect driver distraction in some scenarios, but it can be completely shut off via the menu, and it doesn’t need to be activated to use the driver assistance systems, although a Subaru spokesperson told CR that the company “would not rule out” requiring the camera for adaptive cruise control in the future. As with Autopilot, these systems let drivers engage active driving assistance while the monitoring camera is completely covered by a physical barrier or toggled off in the menu.
CR also found that we could enable Tesla’s FSD beta—an evolving collection of features that can assist the driver with navigating to a destination, coming to a complete halt at traffic lights and stop signs, and making turns on city streets—with the camera covered, and we saw no difference in the feature’s performance or in our ability to enable it on city streets. “It’s quite a disappointment that Tesla has a driver monitoring camera, yet they aren’t actually requiring it to be used when Autopilot and FSD are engaged,” Funkhouser says.
By contrast, when we tested vehicles equipped with GM’s Super Cruise driving assistance system, which uses infrared cameras to track a driver’s eye and head position, covering up their cameras disabled Super Cruise.
Our findings are supported by evaluations from researchers at MIT’s Advanced Vehicle Technology Consortium (AVT), who performed similar tests on their Tesla Model 3. With Autopilot engaged, they were unable to trigger a warning through a range of testing procedures, which included the driver texting on a smartphone and entirely blocking the view of the road by holding a clipboard in front of the driver’s face.
“In the Tesla Model 3 we studied, it’s not quite clear how the camera-based driver monitoring system is supporting the driver,” says Bryan Reimer, a research scientist in the MIT AgeLab and associate director of the New England University Transportation Center at MIT. He also leads the AVT.
The lack of clarity can further undermine the safety benefits of a driver monitoring system, Reimer says. By contrast, GM’s Super Cruise, which delivers multiple warnings to grab a distracted driver’s attention, is designed to make clear what a driver’s responsibilities are, he says. “It’s very easy to understand and see how Super Cruise’s driver monitoring is shaping the behavior of the operator.”
Initially, Tesla’s decision to add camera-based driver monitoring seemed like a step in the right direction, but Funkhouser says the camera doesn’t do enough to keep drivers engaged. Tesla vehicles continue to use torque sensors to determine whether a driver is applying pressure to the steering wheel, and Autopilot will still disengage if it senses that a driver’s hands have been off the wheel for too long.
But simply having hands on the wheel doesn’t mean a driver is paying attention, says Jake Fisher, senior director of CR’s Auto Test Center. “To keep people safe, the system should prevent the driver from using active driving assistance if the driver stops looking at the road,” he says. “Tesla’s system simply doesn’t do that.”
Editor’s Note: This version was updated on Jan. 15 to remove an incorrect reference to the model year of our Model S and Model Y.