SANTA CLARA, Calif.--Holistic system design coupled with gradual design evolution and
sensor fusion will put autonomous vehicles in every driveway. Some day. At
least that was my takeaway from a recent presentation by Nathaniel Fairfield,
the technical lead on Google's autonomous vehicle project. Fairfield
delivered a keynote here to a packed room of Embedded Vision Summit attendees,
but he was tantalizingly light on technical specifics when it came to sensor
architectures and design roadmaps. Still, he gave glimpses of impressive design
engineering and made a solid case for the robot cars that swivel heads on
highways. He cited a range of potential improvements.
Although this technology is undoubtedly amazing already, Google is aiming quite high.
The "mission is to transform mobility, not make slightly better cruise control,"
Fairfield said. The challenge is to design a human-like solution that isn't human and to do so in a
noisy, chaotic real-world environment at an affordable cost with reasonable
compute requirements. No problem, right?
For the time being, both the first Prius- and Lexus-based generations of Google
autonomous vehicles and the more recent fully
autonomous version use multi-sensor, multi-modal technology to navigate
the world, ranging from radar and lasers to cameras and Google maps.
Making sense of all that data in real time is nontrivial. For example, meshing data from all
those cameras (he wouldn't tell us how many) and sensors together should require
an enormous computational capability, but that turns out not to be such an
issue. In the Lexus case, "they use a standard equivalent of a desktop computer,"
Fairfield said. "It doesn't need a lot because it's on the freeway and there's
a lot of structure (via mapping) we can exploit."
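Fairfield's point about exploiting map structure can be illustrated with a small sketch. This is purely my illustration, not Google's pipeline: the lane geometry, corridor width, and detections below are all hypothetical. The idea is simply that a prior map lets a perception stage discard off-road clutter cheaply, before any expensive tracking runs.

```python
# Illustrative sketch (not Google's actual pipeline): use a prior map to
# shrink the perception workload. Detections far from any mapped lane
# centerline are discarded before the expensive tracking stage.

import math

# Hypothetical mapped lane centerlines, as (x, y) waypoints in meters.
LANE_CENTERLINES = [
    [(0.0, 0.0), (50.0, 0.0), (100.0, 0.5)],   # lane 1
    [(0.0, 3.5), (50.0, 3.5), (100.0, 4.0)],   # lane 2
]

LANE_HALF_WIDTH = 2.0  # meters; anything farther out is treated as clutter


def distance_to_lane(point, centerline):
    """Distance from a detection to the nearest waypoint of one lane."""
    return min(math.dist(point, wp) for wp in centerline)


def filter_by_map(detections):
    """Keep only detections that lie within a mapped lane corridor."""
    return [
        d for d in detections
        if any(distance_to_lane(d, lane) <= LANE_HALF_WIDTH
               for lane in LANE_CENTERLINES)
    ]


if __name__ == "__main__":
    raw = [(49.0, 0.2), (48.0, 30.0), (51.0, 3.4)]  # two on-road, one off-road
    print(filter_by_map(raw))  # the (48.0, 30.0) clutter point is dropped
```

A real system would use distance to the lane polyline segments rather than to waypoints, but the payoff is the same: structure known in advance replaces computation done on the fly.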
Atop the vehicles sit Velodyne Lidar systems to complement other sensors and
cameras. Can those be replaced with, say, a number of additional (and less
expensive) cameras? While Google is looking into reducing sensor costs, replacing the Lidar is not
realistic, Fairfield said:

"[Cameras are] fantastic for some purposes, but the laser is great--for example, at night. It
just sees in the dark. Otherwise you'd have to surround the car with
spotlights. I don't see one (type) of sensor replacing another. I see us
getting better at integrating them and building that combined system. I really
do think they're all important."
An attendee probed Fairfield about the sensor architecture, hoping to understand where the
filtering and processing are managed. He kept it close to the vest:

"[We] are computationally limited, the sort of theoretically awesome,
dump-it-into-one-massive filter that sorts it all out, that's not really
tractable. We build simpler systems to combine the information and then fuse
it at a higher level."
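The "simpler systems, then fuse at a higher level" remark describes a classic hierarchical-fusion pattern. As a hedged sketch of the general idea (my reading of the quote, not Google's implementation): suppose each per-sensor module reports an estimate along with its own variance, and a top-level combiner merges them with inverse-variance weighting, the closed form a Kalman-style update gives for independent Gaussian estimates. All sensor numbers below are made up.

```python
# Illustrative hierarchical fusion: per-sensor modules do their own simple
# processing and hand up (estimate, variance) pairs; the top level fuses
# them with inverse-variance weighting. Not Google's implementation.

def fuse(estimates):
    """estimates: list of (value, variance) pairs from per-sensor modules.

    Returns the inverse-variance-weighted value and the fused variance,
    which is always smaller than any single sensor's variance.
    """
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    fused_value = sum(w * v for w, (v, _) in zip(weights, estimates)) / total
    fused_variance = 1.0 / total
    return fused_value, fused_variance


if __name__ == "__main__":
    # Hypothetical single-axis range estimates from three sensor heads.
    radar = (10.2, 1.0)    # noisy but weather-robust
    lidar = (10.0, 0.25)   # precise, sees in the dark
    camera = (10.8, 4.0)   # least certain in this scene
    value, variance = fuse([radar, lidar, camera])
    print(round(value, 3), round(variance, 3))  # prints: 10.076 0.19
```

Note how the fused value leans toward the lidar reading, the lowest-variance source, which matches Fairfield's point that the sensors complement rather than replace one another.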
Asked whether the roadmap (pardon the pun) will reduce reliance on maps, Fairfield
replied, "I'd love to. Longer term that's a good direction (and) it can be a gradual
process. But we have no plans to do it (soon)."
Another attendee asked how to mitigate interference among lasers. Fairfield said:
"The radars--off-the-shelf radars--are designed not to interfere with each other. With the
lasers, we have detected interference...it was still less than interference you
get from the sun and other sparkly stuff you get in the world."
If, like me, you're dying to buy one of these driverless cars to ease your mind during lousy commutes and long hauls, you're just going to have to be patient. In the meantime, we can satisfy ourselves by watching a really amazing design evolution happen before our eyes.