How Google's self-driving car sees the road JUL 09 2015
Chris Urmson is the Director of Self-Driving Cars at Google[x] and in March, he gave a talk at TED about the company's self-driving cars. The second half of the presentation is fascinating; Urmson shows more than a dozen different traffic scenarios and how the car sees and reacts to each one.
It will be interesting to see how roads, cars, and our behavior change when self-driving cars hit the streets. Right now, street markings, signage, and automobiles are designed for how human drivers see the world. Computers see the road quite differently, and if Google's take on the self-driving car becomes popular, it would be wise to adopt different standards to help them navigate more smoothly. Maintaining painted lines might become more important, along with eliminating superfluous signage close to the roadway. Maybe human-driven cars would be required to display a special marking alerting self-driving cars to potential hazards.1 Positioning of headlights and taillights might become more standardized.
Human drivers, cyclists, and pedestrians will necessarily adapt to self-driving cars as well. Some will take advantage of the cars' politeness. But mostly I suspect that learning to interact with self-driving cars will require a different approach, just as people talk to computers differently than they do to other humans -- think of how you formulate a successful search query, speak to Siri, or, more to the point, manipulate a Wii remote so the sensor dingus on top of your TV can interpret what you're doing.
Although if the car is smart enough to parse the arm motions of a police officer directing traffic, it can probably pick out the relatively inconsistent movement of a human-driven car in a second or two.↩