If you ask Google and the automakers why we need self-driving cars, they say the main reason is safety. Sure, letting a computer handle the driving will be a huge convenience. But consider that more than 90 percent of car crashes are the result of human error, something that, presumably, wouldn’t be a problem for autonomous cars.

“We have 1.2 million people killed worldwide, 33,000 killed in the U.S. per year; that’s incredible,” says Google’s Chris Urmson. “The 33,000 number is comparable to a 737 falling out of the sky almost five days a week, which would be completely unacceptable in air travel.”

Active safety features, such as forward-collision warning with automatic emergency braking, are already reducing accidents. The Insurance Institute for Highway Safety says the combination of those two technologies reduces rear-end crashes by about 40 percent. Additional aids such as rear-collision alert, pedestrian detection, and blind-spot monitoring will only help further.

Over the long term, there's little debate that self-driving cars have the potential to drastically reduce, or possibly even eliminate, crashes. In the interim, as self-driving cars navigate traffic alongside unpredictable human drivers, things will be murky.

That murkiness was on display in February, when a Google self-driving prototype collided at low speed with a municipal bus. Attempting a right turn on a busy boulevard, the Google car found its path blocked by sandbags in the road, a situation its software wasn't programmed to handle intuitively. The car merged back into traffic, calculating that the bus approaching from behind would yield. The bus did not yield, and the two vehicles collided.

Google, firmly on the side of heading straight to full autonomy, worries that inviting drivers to tune in and out of the task of driving could be a serious problem; give people the option to pay even less attention to their driving, and many of them are bound to bury their heads further into their smartphones or laptops.

Suddenly requiring a disengaged human to retake control of the driving at a critical moment will probably not end well, according to driving simulations conducted at Stanford University.

In a 2013 interview with Automotive News, the late Stanford professor Clifford Nass explained: “You look away to read The New York Times or watch ‘Les Miz,’ and traffic got crowded, the road surface changed, and it started raining. You have to be responsible for taking over the car. That is a phenomenal mental transition problem.”

Volvo’s case against going completely autonomous rests on its survey of more than 10,000 respondents to date: 92 percent of drivers still want a car equipped with a steering wheel and the ability to take over the driving at any moment. The summary: The option of driving manually must be preserved. Volvo says its goal for the future is not to remove driving but rather to support the driver when the task is less fun, such as during the daily commute or a traffic jam.

A recent survey by AAA shows that 84 percent of respondents who do not want semi-autonomous features on their next vehicle said they trust their own driving skills more than the technology.

Mark Rosekind, head of the National Highway Traffic Safety Administration, responded to the dilemma of straight-to-full-automation vs. the step-by-step process by saying, “We would lose by betting on one or the other path.”

Editor's Note: This article also appeared in the May 2016 issue of Consumer Reports magazine.