When it runs into questionable situations, the human driver can assist; so far, automated cars are all legally required to have a human behind the wheel. In the car we sampled, there were actually two humans onboard: a driver behind the wheel and a technician with a laptop in the front passenger’s seat, recording data to be fed back into the cars’ elaborately detailed maps to expand the cars’ capabilities. So far, the project director Chris Urmson and the lead software developer Dmitri Dolgov have logged about 200 continuous miles without human intervention. But as Urmson says, “If we log 200 miles without intervening, then we didn’t learn anything.”
If Google has come halfway to developing the autonomous car in the last five years, it still has a long way to go. And every incremental advance becomes harder. But once the car learns to drive in cities, Google engineers say, highway driving is easier to develop. Also, because every trip starts and ends on surface streets, a self-driving car that can drive in the city is more useful than one that can only take over control on the highway.
In typical Silicon Valley fashion, Google is speeding ahead with development of the autonomous car despite speed bumps that keep others plodding along. General Motors and other automakers, for example, have formed a consortium to develop vehicle-to-vehicle and vehicle-to-infrastructure Wi-Fi communications to tell cars when to stop and to anticipate the actions of others on the road.
The Google car relies on maps. One of the development challenges, for example, was correctly identifying traffic lights. While the cameras mounted on the car have no problem telling a red light from a green or yellow one, they have to know where to look. So the Google maps record exactly how high off the ground the lights are mounted at every intersection, where the lane lines are, and where the curbs are. (The project doesn’t use data from Google Street View cars, because it is not detailed enough.) The Google car can’t drive autonomously where it doesn’t have dedicated maps. So far, engineers haven’t tried driving in snow, and rain and fog pose some limitations. Interestingly, Google says driving at night actually works better than driving during the day.
To “see,” the car relies on forward radars and cameras, much like those in advanced pre-collision and lane guidance systems on cars currently on the market, such as the Jeep Cherokee and Mercedes-Benz S-Class. But it adds a roof-mounted laser range finder that scans the road ahead, behind, and to the sides 60 times a second. The car tries to follow the most conservative course of action in responding to every situation, and it updates reactions in real time. It gathers and processes 30 to 40 megabytes of data per second.
Engineers are also working on smoothing out the car’s reactions. For example, in our brief, 25-minute drive, the car stopped fairly abruptly for one yellow light, accelerated slightly through another, then immediately had to brake for jaywalkers. In another instance, it made a fairly abrupt turn into the entrance of a left-turn lane. Passengers who get carsick wouldn’t be happy here—and in this car, everybody’s a passenger.
“All vehicle design in the past has been built on the assumption that every car has a human driver,” said Larry Burns, a senior engineer who spent his career at General Motors. That affects everything from the shape of the car to where the engine goes. “With this car, that’s no longer true,” he said. He estimated that freeing drivers up for more productive tasks, such as building relationships or working on creative projects from their cell phones, “could add $2 trillion to the economy.”
We think this technology could provide a huge benefit to road safety and to mobility, enabling elderly and disabled consumers to be independent and productive even when they can’t drive.