Tesla’s ‘Full Self-Driving’ Beta Software Used on Public Roads Lacks Safeguards

Consumer Reports' car safety experts worry that Tesla continues to use vehicle owners as beta testers for its new features, putting others on the road at risk

Tesla Model Y. Photo: Tesla

After Tesla released the latest prototype version of its driving assistance software last week, reports from owners have gained the attention of researchers and safety experts—both at CR and elsewhere—who have expressed concerns about the system’s performance and safety.

CR plans to independently test the software, popularly known as FSD beta 9, as soon as our Model Y SUV receives the update from Tesla. So far, our experts have watched videos posted on social media of other drivers trying it out, and they are concerned by what they’re seeing, including vehicles missing turns, scraping against bushes, and heading toward parked cars. Even Tesla CEO Elon Musk urged drivers to use caution with FSD beta 9, writing on Twitter that “there will be unknown issues, so please be paranoid.”

FSD beta 9 is a prototype of what the automaker calls its “Full Self-Driving” feature, which, despite its name, does not yet make a Tesla fully self-driving. Although Tesla has been sending out software updates to its vehicles for years, adding new features with every release, the beta 9 upgrade offers some of the most sweeping changes yet to how the vehicle operates. The software now automates more driving tasks. For example, Tesla vehicles equipped with it can navigate intersections and city streets under the driver’s supervision.

“Videos of FSD beta 9 in action don’t show a system that makes driving safer or even less stressful,” says Jake Fisher, senior director of CR’s Auto Test Center. “Consumers are simply paying to be test engineers for developing technology without adequate safety protection.”

As with previous updates, CR is concerned that Tesla is still using its existing owners and their vehicles to beta-test the new features on public roads. This worries some road safety experts because other drivers, as well as cyclists and pedestrians, are unaware that they’re part of an ongoing experiment that they did not consent to.


Bryan Reimer, a professor at MIT and founder of the Advanced Vehicle Technology (AVT) consortium, a group that researches vehicle automation, told CR that “while drivers may have some awareness of the increased risk that they are assuming, other road users—drivers, pedestrians, cyclists, etc.—are unaware that they are in the presence of a test vehicle and have not consented to take on this risk.”

Tesla’s approach stands in contrast to other companies developing self-driving car technology, such as Argo AI, Cruise, and Waymo. We talked to those companies for this article, and they told us that they limit their software tests to private tracks and/or use trained safety drivers as monitors. The representatives we interviewed from these companies each declined to directly address our questions about Tesla and would speak about only their own research programs. Tesla did not respond to CR requests for comment for this article. 

Fisher and others believe that Tesla should at the very least be monitoring drivers in real time to ensure that they’re paying attention while using new software. For example, Fisher says the updated software has “impressive” onscreen graphics, but he’s worried that even a brief glance by the driver at the display might be too long to prevent the system from crashing into a car or pedestrian. Some new Tesla vehicles come with real-time driver monitoring cameras enabled, although CR’s experts have raised questions about their effectiveness.

“Tesla just asking people to pay attention isn’t enough—the system needs to make sure people are engaged when the system is operational,” says Fisher, who recommends that Tesla use in-car driver monitoring systems to ensure that drivers have their eyes on the road. “We already know that testing developing self-driving systems without adequate driver support can—and will—end in fatalities.”

An Uber self-driving test vehicle struck and killed 49-year-old Elaine Herzberg in 2018 as she crossed a street in the Phoenix area. An investigation found that the safety driver behind the wheel of the test vehicle was distracted at the time and did not intervene to stop the car. A federal investigation partially blamed insufficient regulation and inadequate safety policies at Uber for the collision. After that tragedy, many companies enacted stricter internal safety policies for testing self-driving systems, such as monitoring drivers to ensure that they aren’t distracted and relying more on simulations and closed test tracks.

A tweet from Tesla CEO Elon Musk. Source: Twitter

‘Like a Drunk Driver’

In addition to CR’s engineers, industry experts who have watched videos of the new Full Self-Driving software in action have expressed concern about its performance.

Selika Josiah Talbott, a professor at the American University School of Public Affairs in Washington, D.C., who studies autonomous vehicles, said that the FSD beta 9-equipped Teslas in videos she has seen act “almost like a drunk driver,” struggling to stay between lane lines. “It’s meandering to the left; it’s meandering to the right,” she says. “While its right-hand turns appear to be fairly solid, the left-hand turns are almost wild.”

A YouTube video uploaded by user AI Addict shows some impressive navigation around parked vehicles and through intersections, but the car also makes a multitude of mistakes: During the drive, the Tesla scrapes against a bush, takes the wrong lane after a turn, and heads directly for a parked car, among other problems. The software also occasionally disengages while driving, abruptly handing control back to the driver.

“It’s hard to know just by watching these videos what the exact problem is, but just watching the videos it’s clear [that] it’s having an object detection and/or classification problem,” says Missy Cummings, an automation expert who is director of the Humans and Autonomy Laboratory at Duke University in Durham, N.C. In other words, she says, the car is struggling to determine what the objects it perceives are, what to do with that information, or both.

According to Cummings, Tesla may eventually be able to build a self-driving car, but the amount of testing the automaker has done so far is insufficient to deliver that capability with its existing software. “I’m not going to rule out that at some point in the future that’s a possible event. But are they there now? No. Are they even close? No.”

AI Addict’s Tesla scraping against some shrubbery; the new onscreen graphics are visible. Source: AI Addict / YouTube

A Safer Approach?

Talbott, the American University professor, says that Tesla’s approach to testing software on public roads without any driver monitoring makes the automaker a “lone wolf” in what she otherwise considers a safe industry. “The rest of the industry says, ‘We want to get it right, we want confidence from the general public, we want regulators to be partners,’” says Talbott, who is also a former motor vehicle regulator and product liability attorney. So far, “getting it right” has meant the use of trained safety drivers and closed test courses to validate vehicle software before testing on public roads, she says.

CR spoke with representatives from Argo AI, Cruise, and Waymo, which are all testing self-driving car prototypes on public roads and working to create commercially available fully self-driving vehicles. Although none would comment on Tesla directly, all three pointed us to their own safety policies. 

“Each change of our software undergoes a rigorous release process and is tested through a combination of simulation testing, closed-course testing, and driving on public roadways,” Waymo spokesperson Sandy Karp told CR. 

Ray Wert at Cruise told CR that the company’s autonomous vehicles would be deployed only after the company is satisfied that the vehicles are safer than a human driver. “Our plan is to launch with communities, not at them,” he says. And an Argo AI representative pointed to the company’s extensive driver training and software development standards, which are all publicly available online.

Partners for Autonomous Vehicle Education (PAVE), an industry group, told CR that it supports the use of highly trained professional safety operators for testing and that using untrained safety drivers may jeopardize greater public acceptance of vehicle automation.

Ultimately, it may be up to federal regulators to determine what kinds of vehicle software can be used on public roads—a step that they’ve been reluctant to take so far. “I have never seen anything like we’re seeing today with Tesla, where it’s as if the U.S. Department of Transportation has blinders on when it comes to the actions of this particular company,” Talbott says.

But there have been some recent movements toward further regulation. For example, the National Highway Traffic Safety Administration has ordered new reporting requirements for vehicles involved in crashes while automation was in use.

“I am hopeful that the data will allow NHTSA to better monitor and manage automotive and technology companies working to perfect automation systems over the coming decades,” Reimer says.

Eventually, regulators will have to catch up with automakers’ plans to test rapidly evolving autonomous vehicle technology.

“Car technology is advancing really quickly, and automation has a lot of potential, but policymakers need to step up to get strong, sensible safety rules in place,” says William Wallace, manager of safety policy at CR. “Otherwise, some companies will just treat our public roads as if they were private proving grounds, with little holding them accountable for safety.”



Keith Barry

Despite my love for quirky, old European sedans like the Renault Medallion, it's my passion to help others find a safe, reliable car that still puts a smile on their face—even if they're stuck in traffic. When I'm not behind the wheel or the keyboard, you can find me exploring a new city on foot or planning my next trip.