This image provided by the Tempe Police Department shows an Uber SUV after it hit a woman March 18, 2018, in Tempe, Ariz.

A report that the lone backup driver inside an Uber self-driving test car was streaming video when the car killed a pedestrian underscores the need for tougher test protocols on public roads, safety advocates say.

This is especially important, they say, as multiple self-driving technology companies and state and local governments forge agreements to allow more test vehicles out among the public.

This week, the Massachusetts Department of Transportation and 14 cities in the state agreed to allow testing of autonomous vehicles if companies enter into a memorandum of understanding, promising that their testing has been safe and “further testing of AVs is reasonably expected to be conducted safely and efficiently.”

Tempe, Ariz., police released new information Thursday about the fatal March crash involving one of Uber’s test vehicles. The police report, according to Reuters and other media, states that Uber backup driver Rafaela Vasquez was streaming a Hulu video at the time the car struck and killed a pedestrian.

The police department, which released the crash details in response to a public records request, said the crash was “entirely avoidable,” adding that the driver could be charged with vehicular manslaughter, according to Reuters.

Humans as Backup

While Uber might have satisfied Arizona law at the time by having a single backup driver in the car, that’s not adequate for safely testing autonomous vehicles, says Bryan Reimer, director of the Advanced Vehicle Technology (AVT) Consortium at the Massachusetts Institute of Technology.

Reimer and others point out that humans not actively engaged in driving can be lulled into inattention, even if their sole job is to pay attention. He stresses that even if backup drivers are paying attention, they’re still passive observers not primed for a quick reaction.

“This driver was set up to fail,” Reimer says. “You’re mindlessly looking at the road ahead of you. It’s hard to keep looking at something when nothing changes. You don’t have to make moment-to-moment decisions about how to turn the wheel.”

Consumer Reports is a member of MIT’s AVT Consortium, along with automakers such as Jaguar Land Rover and Toyota, insurers like Liberty Mutual and Progressive, auto tech supplier Aptiv, and others.

Soon after the crash, Uber suspended its automated-vehicle testing. Last month, the company announced that it was pulling its testing program out of Arizona and refocusing its efforts near engineering centers in San Francisco and Pittsburgh.

There have been accidents and mishaps in testing programs, but the Uber incident was the first death caused by a self-driving test car.


The Tempe police report says distraction was a factor in the crash that killed the pedestrian, Elaine Herzberg.

During Vasquez’s ride in the Uber vehicle, which was recorded on video inside the vehicle as part of the testing, she looked down 204 times, mostly in the direction of the lower center console near her right knee, according to the police report. She was looking down for 5.2 of the final 5.7 seconds prior to the crash, the report says.

Records from Vasquez’s account, provided by the video-streaming service Hulu under a search warrant, showed that “The Voice” was streaming on her account during the final 43 minutes of the drive and that the stream ended at 9:59 p.m., the approximate time of the collision, the police report says.

The police concluded that the crash wouldn’t have occurred if Vasquez had been paying attention to the roadway, and indicated that she could be charged with vehicular manslaughter. Details from the police report were published Thursday by the Arizona Republic, Reuters, and other media outlets.

Uber has said it’s in the middle of a top-to-bottom review of its safety culture, including operating procedures for its vehicle operators, led by former National Transportation Safety Board Chairman Christopher Hart.

“We continue to cooperate fully with ongoing investigations while conducting our own internal safety review,” Uber said in an emailed statement. “We have a strict policy prohibiting mobile device usage for anyone operating our self-driving vehicles. We plan to share more on the changes we’ll make to our program soon.”

Companies developing self-driving technology vary widely in how they’re testing cars. Waymo made a decision early in its testing program to aim first for full automation in its software after its trained employees couldn’t stay focused on the road as the cars drove themselves.

Other companies, such as Uber, are rolling out technology more quickly and testing it in real-world situations on public roads, with the fail-safe of human drivers in the front seat, watching the road and ready to react in an emergency.

After Uber announced that it intended to resume testing, Pittsburgh laid out a set of conditions for the company, including a strict 25-mph speed limit for test cars to make collisions with pedestrians more survivable. Pittsburgh also wants Uber’s app to alert human drivers when they’re exceeding speed limits.

David Friedman, director of cars and product policy and analysis at Consumers Union, the advocacy division of Consumer Reports, says the Arizona crash shows that Uber’s approach is dangerous. He echoed Reimer’s point that it’s human nature to stop paying careful attention during stretches of monotony.

“This tragedy was entirely avoidable because Uber never should have put such an incapable vehicle on the road in the first place,” Friedman says. “Uber, Waymo, and all other companies developing self-driving cars need a much better approach than requiring a single driver to effectively watch paint dry.”