A Google-modified, self-driving Lexus RX drove itself into the side of a public-transit bus on Feb. 14 in Mountain View, Calif.—the first recorded instance in which one of the company’s autonomous vehicles actively contributed to an accident. No injuries were reported in the low-speed collision, but the crash serves as a reminder that, even though self-driving prototypes such as Google’s are operating on public roads, the technology still has a long way to go before these vehicles can navigate every situation.

That’s precisely why Consumers Union, the policy and advocacy arm of Consumer Reports, along with other safety advocacy groups, has been calling for the National Highway Traffic Safety Administration (NHTSA) to commit to maximum transparency and public involvement as policy and safety standards are developed covering autonomous vehicles.

Google has been operating its self-driving car program on public roads in California for 15 months to help its cars map the terrain and learn from real-world driving situations. The company has filed regular, detailed reports with the California Department of Motor Vehicles, and the program has logged 424,331 autonomous miles.

During this time, a human driver has had to assume control of these purportedly self-driving vehicles 341 times, an average of 22.7 times a month. In 272 of those instances, the self-driving technology failed and automatically ceded control to the human driver; in the other 69, the driver felt compelled to intervene and take control.
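The reported figures hang together arithmetically; a quick sketch, using only the numbers in the article, shows how the monthly average is derived and yields a rough miles-per-disengagement rate as well:

```python
# Back-of-the-envelope check of the disengagement figures reported
# to the California DMV (all numbers are taken from the article).
total_miles = 424_331      # autonomous miles logged by the program
months = 15                # months of public-road testing
auto_failures = 272        # times the technology ceded control on its own
driver_takeovers = 69      # times the test driver chose to intervene

total_disengagements = auto_failures + driver_takeovers
per_month = total_disengagements / months
miles_per_disengagement = total_miles / total_disengagements

print(total_disengagements)            # 341
print(round(per_month, 1))             # 22.7
print(round(miles_per_disengagement))  # 1244
```

By that math, a human had to take over roughly once every 1,244 autonomous miles.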

This clearly shows that, despite the sophistication of the software that guides Google’s vehicles, driving in the real world is fraught with a complexity that these cars are not yet capable of handling on their own.

And the accident that took place on Feb. 14 was indeed a complicated situation. Google’s 2012 Lexus RX 450h was attempting a legal right turn on a red light when it encountered some sandbags placed near a storm drain. The car stopped, waited for some cars to pass, sensed an opening, and attempted to go around the sandbags. Google says the car had detected an approaching bus but assumed the bus would yield. The bus did not stop, however, and the Google car made contact with its side at 2 mph. The DMV accident report shows the bus moving at about 15 mph.

Google self-driving Lexus RX SUV
Photo: Google

Up to this point, Google has been proud that none of the crashes involving its prototypes was the company’s fault. But Google took partial blame for this incident, calling it a “classic example of the negotiation that’s a normal part of driving.”

Still, Google emphasized that even its test driver, who saw the bus in the mirror, figured the bus driver would slow or stop to allow them to merge.

“In this case, we clearly bear some responsibility,” read Google’s February monthly self-driving car report, adding, “because if our car hadn’t moved, there wouldn’t have been a collision.”

Google says it has reviewed the incident in its simulator and already made refinements to its software. “Our cars will more deeply understand that buses (and other large vehicles) are less likely to yield to us than other types of vehicles, and we hope to handle situations like this more gracefully in the future.”

There is no doubt that Google will learn from the incident and make its self-driving vehicles smarter because of it. But according to Jake Fisher, Consumer Reports director of auto testing, the crash should offer a reality check on some of the near-term expectations surrounding self-driving technology.

“Self-driving cars don’t act like humans; they can’t make eye contact, or give hand signals with other drivers to help get through odd situations,” Fisher said. “We are enthused, and frankly impressed, by the progress that is being made and how this helps advance safety features in today’s cars, but we expect that a world where true autonomous cars are the norm is still decades away.”

In January, NHTSA released a statement pledging that within six months it would work with manufacturers and outside experts to develop a policy around self-driving technology. But since then, neither NHTSA nor its parent agency, the Department of Transportation, has held any public proceedings on the issue.

Consumers Union and other consumer/auto safety groups are asking NHTSA to hold public meetings, provide an open, public docket for information, and collaborate with advocates about policy and standards for self-driving cars. And the Google car incident underlines the need for the government to open up that process as soon as possible.

Google has shown its support in a statement provided to Consumer Reports: “Self-driving cars have the potential to save lives and improve mobility for those who cannot drive. We agree that the public should be included in an open and transparent process that gets this technology safely onto our roads.”

Consumers Union is joining other consumer advocacy and safety groups in delivering that message to the Department of Transportation today.