A top researcher in autonomous driving technology is telling a cautionary tale and urging the world to slow its headlong race toward the self-driving car. In sum, her message is simple: there is still too much to do and too much to learn before we launch self-driving vehicles on an unsuspecting public. It runs directly contrary to Google’s “hell-bent-for-leather” approach to self-driving technology. Indeed, in testimony at a Senate hearing last week, Google’s Chris Urmson urged that all roadblocks to autonomous technology be eliminated so that self-driving vehicles and related technology can be deployed as quickly as possible.
Urmson’s testimony before the Senate Committee on Commerce, Science and Transportation came just after Google, now a subsidiary of Alphabet Inc., admitted partial responsibility for the first reported accident in which a “Googlemobile” struck a city transit bus in Mountain View, Calif. It was the first reported case of a Google research vehicle striking and causing damage to another vehicle. There have been 17 other reported accidents involving Google’s test fleet, but those accidents were minor and were the responsibility of other drivers.
Google, in explaining the accident, said that both the operator and the software of the test vehicle expected the bus to slow down and allow the test SUV to pull into the traffic lane. The Lexus RX450h was operating in autonomous mode and could not account for the fact that the bus maintained its speed rather than slowing to let the test vehicle merge, as the software expected. As a result, the vehicle struck the side of the bus.
Google said it had corrected the software glitch that allowed the accident to occur, and that its fix accounts for future interactions between autonomous and non-autonomous vehicles. Of course, Google’s faith in its technology and software is absolute, if somewhat myopic, as it seems to focus solely on the technology and not the bigger picture. However, Google is not alone. It is just one of many companies racing toward what they apparently see as their Holy Grail: autonomous vehicle technology. Witnesses from Lyft, Uber, General Motors, Delphi and others all regaled the committee with their versions of automotive Nirvana: getting the driver out of the picture and letting cars drive themselves.
Throwing some serious cold water on the proponents’ flaming plans, robotics expert and Duke University professor Missy Cummings said to slow down; things are moving much too fast. Self-driving cars, she said, are absolutely “not ready for widespread deployment.”
The larger picture was presented quite starkly yesterday by Prof. Cummings, whose credits include being one of the Navy’s first female fighter pilots, serving from 1988 to 1999; managing a $100 million Navy program to create a robotic helicopter; and, now, heading Duke’s Humans and Autonomy Lab. On the research level, Prof. Cummings heads a National Science Foundation-funded program studying how people interact with self-driving cars. She spoke, at length, with Automotive News.
Prof. Cummings believes the auto industry can learn a great deal from the airline industry, which underwent a major change in its operating environment starting more than three decades ago.
“I think the auto industry could learn a lot from how airlines and airplane manufacturers worked to automate their planes, and tested them to be sure that they would work in all conditions,” Cummings said in a long interview. “We would have never allowed people to fly in airplanes when the industry was still trying to figure out automated landing. The planes had to be tested, and manufacturers had to prove they could land under all sorts of different conditions.”
Prof. Cummings’ view contrasted with that of the majority of autonomous driving tech supporters, who backed the quick elimination from cars of steering wheels, brakes and other obvious means of human intervention. Urmson, in his testimony before the Senate panel, urged lawmakers to clear the way for removing such controls from self-driving cars.
Prof. Cummings countered: “I believe that before we take drastic steps such as taking steering wheels out of cars, the car manufacturers need to prove that a human will never need to intervene. We’re simply not going to go to a car with no steering wheel overnight. We will get there eventually — I just think it’s not going to happen as quickly as Google might want.”
The most troubling variable in the automation equation is the driver; there are so many things that can potentially go wrong. If, however, the decision is made to leap to automation quickly, then it has to be an all-or-nothing choice. “That would be my first choice – for everyone to go into an autonomous car. We’re just not ready, and I haven’t seen any test data to suggest we are.”
She pointed to the hand-off interface between machine and driver and said vehicular automation has to be “good enough to at least get the car into a safe position. The car can never assume that when it needs to hand off control, the human will be ready at that instant.”
Prof. Cummings noted that “context is important.” Continuing, she said: “If a traffic policeman is gesturing and a car can’t interpret the gesture, it could slow down, vibrate the seat and ask the human to take over. It’s not critical that a human take over in that case. If they don’t, the car can stop … But if a car is going 65 mph and is having trouble deciding whether to get off the interstate, it can’t just say: ‘Three, two, one, now take over.’ A car would need to say: ‘Click this button if you’re ready to take over,’ and if the [driver doesn’t], the car will need to be able to come to a safe stop in some way.”
Getting to the point of the hearings, Prof. Cummings said the National Highway Traffic Safety Administration (NHTSA) needs to “start setting guidelines for testing and set certain levels of certification so companies know what’s expected of them, and know what is a safe enough system under certain conditions. The most important thing from my perspective is that we do move down this path. Self-driving cars are the future. There’s no question that is where we should go. The question is: How do we get there? And how fast do we need to get there?”