SANTA CLARA, Calif.--Holistic system design, coupled with gradual design evolution and
sensor fusion, will put autonomous vehicles in every driveway. Some day. At
least that was my takeaway from a recent presentation by Nathaniel Fairfield,
the technical lead on Google's autonomous vehicle project. Fairfield
delivered a keynote here to a packed room of Embedded Vision Summit attendees,
but he was tantalizingly light on technical specifics when it came to sensor
architectures and design roadmaps. Still, he gave glimpses of impressive design
engineering and made a solid case for the robot cars that swivel heads on
highways. He cited a range of potential improvements. While
this technology is undoubtedly amazing already, Google is aiming quite high.
"The mission is to transform mobility, not make slightly better cruise control,"
Fairfield said. The challenge is to design a human-like solution that isn't human, and to do so in a
noisy, chaotic, real-world environment, at an affordable cost, with reasonable
compute requirements. No problem, right?
For the time being, both the first Prius- and Lexus-based generations of Google's
autonomous vehicles and the more recent fully
autonomous version use multi-sensor, multi-modal technology to navigate
the world, from radar and cameras to lasers and Google Maps.
Making sense of that data in real time is nontrivial. For example, meshing data from all
those cameras (he wouldn't tell us how many) and sensors would seem to require
enormous computational capability, but that turns out not to be such an
issue. In the Lexus case, "they use a standard equivalent of a desktop computer,"
Fairfield said. "It doesn't need a lot because it's on the freeway and there's
a lot of structure (via mapping) we can exploit."
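
Fairfield's point about structure is worth unpacking: a strong map prior shrinks the estimation problem, which is why freeway driving can get by on desktop-class compute. Here's a minimal sketch of that idea, assuming a hypothetical lane-centerline map; the names and numbers are illustrative, not anything Google disclosed.

    # Hypothetical sketch: exploiting map structure to shrink the problem.
    # With a prior lane-centerline map, freeway localization can reduce to
    # finding an offset relative to a known lane, instead of an unconstrained
    # pose search. Everything here is illustrative, not Google's design.

    import math

    def localize_on_lane(centerline, measured_xy):
        """Project a noisy position fix onto the nearest point of a known
        lane centerline (a list of (x, y) waypoints). Returns the index of
        the closest waypoint and the lateral distance from it."""
        best_i, best_d = 0, float("inf")
        for i, (cx, cy) in enumerate(centerline):
            d = math.hypot(measured_xy[0] - cx, measured_xy[1] - cy)
            if d < best_d:
                best_i, best_d = i, d
        return best_i, best_d

    # Usage: a straight east-west lane and a GPS-like fix slightly off-center.
    lane = [(float(x), 0.0) for x in range(0, 100, 5)]
    idx, lateral = localize_on_lane(lane, (42.3, 1.7))
    print(f"nearest waypoint #{idx}, lateral offset {lateral:.2f} m")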
Atop the vehicles sit Velodyne Lidar systems to complement other sensors and
cameras. Can those be replaced with, say, a number of additional (and less
expensive) cameras? While Google is looking into reducing sensor costs, replacing the Lidar is not
realistic, Fairfield said:
"Cameras are fantastic for some purposes, but the laser is great--for example, at night. It
just sees in the dark. Otherwise you'd have to surround the car with
spotlights. I don't see one (type) of sensor replacing another. I see us
getting better at integrating them and building that combined system. I really
do think they're all important."
One attendee probed Fairfield about the sensor architecture to understand where the
filtering and processing are managed. He kept it close to the vest:
"Because we are computationally limited, the sort of theoretically awesome,
dump-it-into-one-massive filter that sorts it all out, that's not really
tractable. We build simpler systems to combine the information and then fuse
it at a higher level."
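
That answer describes hierarchical fusion: lightweight per-sensor estimators whose compact outputs are combined above, rather than one massive filter chewing on raw measurements. A minimal sketch of the idea, using an inverse-variance weighting rule as a stand-in for whatever Google actually does, with made-up numbers:

    # Hypothetical sketch of "simpler systems, fused at a higher level":
    # each sensor runs its own lightweight estimator, and only the compact
    # (estimate, variance) outputs are combined upstream. Illustrative only.

    def fuse(estimates):
        """Inverse-variance weighted fusion of independent (value, variance)
        estimates of the same quantity, e.g. distance to a lead vehicle."""
        total_w = sum(1.0 / var for _, var in estimates)
        value = sum(v / var for v, var in estimates) / total_w
        return value, 1.0 / total_w  # fused estimate and its variance

    # Per-sensor trackers would each emit an estimate; fusion sits above them.
    radar_est = (25.4, 0.5)   # radar: good range accuracy (assumed numbers)
    lidar_est = (25.1, 0.2)   # lidar: better still
    camera_est = (26.0, 2.0)  # camera: coarser depth
    print(fuse([radar_est, lidar_est, camera_est]))

The appeal of this split is that each per-sensor tracker stays simple and testable, and the fusion step only ever sees a few numbers per sensor instead of the raw data firehose.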
Asked whether the roadmap (pardon the pun) will reduce reliance on maps, Fairfield
replied, "I'd love to. Longer term that's a good direction (and) it can be a gradual
process. But we have no plans to do it (soon)."
Another attendee asked how to mitigate interference among lasers. Fairfield said:
"The radars--off-the-shelf radars--are designed not to interfere with each other. With the
lasers, we have detected interference...it was still less than interference you
get from the sun and other sparkly stuff you get in the world."
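
If interfering lasers really do look like the sun's "sparkly stuff," the same defense plausibly applies to both: isolated outlier returns get rejected against recent history. A toy sketch of that kind of filtering, with made-up data and thresholds:

    # Hypothetical sketch: treating laser interference like any other sparkle.
    # A rogue return (from another lidar, the sun, or a shiny surface) shows
    # up as an isolated outlier; a simple temporal median over recent range
    # readings discards it. Thresholds and data are illustrative only.

    from statistics import median

    def reject_outliers(ranges, window=5, tol=1.0):
        """Keep a range reading only if it sits within `tol` meters of the
        median of the last `window` readings."""
        kept = []
        for i, r in enumerate(ranges):
            recent = ranges[max(0, i - window):i] or [r]
            if abs(r - median(recent)) <= tol:
                kept.append(r)
        return kept

    # A steady 20 m wall with one interference spike at 3.1 m.
    print(reject_outliers([20.0, 20.1, 19.9, 3.1, 20.0, 20.2]))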
If, like me, you're dying to buy one of these driverless cars to ease your mind during lousy commutes and long hauls, you're just going to have to be patient. In the meantime, we can satisfy ourselves by watching a really amazing design evolution happen before our eyes.