If we want truly energy-efficient servers and mobile devices, existing low-power design techniques are not sufficient, according to Jan Rabaey, professor of electrical engineering and computer science at the University of California at Berkeley. In an animated and provocative keynote speech at the Cadence Low Power Technology Summit Oct. 18, Rabaey said we need to re-think the very nature of computing itself.
Few people know more about low-power design than Rabaey. A professor at U.C. Berkeley since 1987, he's scientific co-director of the Berkeley Wireless Research Center, and his research interests include integrated wireless systems, low-energy architectures and circuits, and supporting EDA environments. He has authored three books on low-power design - "Low Power Design Methodologies," "Power Aware Design Methodologies," and "Low Power Design Essentials."
Rabaey started his talk with a look at the "emerging information-technology platform" characterized by three layers - the cloud, mobile devices, and the "swarm." The latter refers to tiny, inexpensive wireless sensor devices that might perform such tasks as environmental, traffic, or health monitoring. (One of Rabaey's research involvements is the Swarm Lab at U.C. Berkeley - a great story in itself). Rabaey observed that power is the "dominating factor" for each of these three layers, and is a "deal breaker" if we can't reduce energy usage. "The cost of computation is the cost of power, so making that cost cheaper is going to be very important," he said.
Conventional Solutions Not Enough
A number of low-power design strategies have arisen since the early 1990s, Rabaey noted.
But we're running out of options, Rabaey said. Trends such as power, performance, and energy efficiency are "flat lining." Technology scaling isn't helping much, leakage is getting worse, and variability is a growing problem. What to do? Here is the essence of Rabaey's "energy-efficient roadmap":
1. Continue to scale the supply voltage downward.
2. Explore new computer architecture ideas.
Lowering supply voltage is the "only option" and is something we have to do, Rabaey said. For any design, there is a minimum energy point, "the lowest energy you can run this thing on." That point is below the threshold voltage. The problem is that lowering supply voltage makes the circuit run more slowly, and there's more susceptibility to variability.
Jan Rabaey points to a perpetual sensor system with solar harvesting as an example of sub-threshold operation
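The minimum energy point can be made concrete with a toy Python model. All constants below are invented for illustration, not real process data: dynamic energy falls as C·V², while leakage energy grows as circuit delay explodes near and below the threshold voltage, so total energy per operation bottoms out below Vth.

```python
import math

# Toy energy model: all constants are illustrative, not real process data.
C_EFF = 1e-12     # effective switched capacitance per operation (F)
I_LEAK = 1e-6     # leakage current scale (A)
V_TH = 0.3        # threshold voltage (V)
T0 = 1e-8         # delay scale (s)

def energy_per_op(v):
    e_dyn = C_EFF * v * v                   # dynamic energy ~ C * V^2
    t = T0 * math.exp((V_TH - v) / 0.05)    # delay explodes below threshold
    e_leak = I_LEAK * v * t                 # leakage energy = I_leak * V * delay
    return e_dyn + e_leak

# Sweep the supply voltage and locate the minimum-energy point
vs = [0.10 + 0.01 * i for i in range(91)]   # 0.10 V .. 1.00 V
v_min = min(vs, key=energy_per_op)
print(f"minimum-energy point ~ {v_min:.2f} V, below Vth = {V_TH} V")
```

In this toy model the optimum lands below the threshold voltage, which is exactly the sub-threshold regime Rabaey describes, along with its cost: delay at that operating point is orders of magnitude worse than at nominal voltage.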
There are two options for mitigating these problems. One is to "back off a bit" and run the supply voltage very close to the threshold voltage. The second option is to build circuits that are self-adapting. "Self timing is the right thing to do," Rabaey said. "When I'm done I'm going to go on to the next thing, not just sit there leaking [energy]."
New Architectures - and No Margins
One new architectural idea, Rabaey said, is "energy proportional systems." The idea is that the system consumes power that is proportional to the task at hand. "If you do less, the power scales, and you would hope it scales linearly. If you don't do anything you should not consume anything." Surprisingly, perhaps, most electronic devices still consume quite a bit of energy when they're doing nothing.
Rabaey observed that the energy proportional concept is taking root in data centers, but has not yet come to mobile devices. He believes a 10X power savings is possible in many cases.
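A back-of-the-envelope sketch shows why the idea matters. The power figures here are assumed purely for illustration: a conventional server that burns 60% of peak power while idle, versus an ideal energy-proportional machine whose power tracks utilization linearly.

```python
# Toy comparison (assumed numbers): a conventional server that idles at 60% of
# peak power vs. an ideal energy-proportional one whose power tracks load.
PEAK_W = 200.0

def conventional_power(util):
    return PEAK_W * (0.6 + 0.4 * util)   # draws 60% of peak even when idle

def proportional_power(util):
    return PEAK_W * util                 # power scales linearly with work

util = 0.25                              # servers often run well below peak
saving = 1 - proportional_power(util) / conventional_power(util)
print(f"power saving at {util:.0%} load: {saving:.0%}")
```

At low utilization the gap widens further, which is where the large savings Rabaey mentions come from: at zero load the proportional machine draws nothing while the conventional one still burns 120 W.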
The next architectural idea, "always optimal systems," includes system modules that are adaptively biased to adjust to operating, manufacturing, and environmental conditions. The device uses sensors to measure parameters such as temperature, delay and leakage. It re-adjusts supply voltage, threshold voltage, and clock rate. "You basically build a dynamic feedback system - you measure, control and act," Rabaey said. "If you do it right you can get tremendous energy savings, because you always run at the best possible energy curve."
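The "measure, control, act" loop can be sketched as a simple feedback controller. The sensor model, voltage range, and target margin below are hypothetical stand-ins: the loop senses timing slack and nudges the supply voltage toward the point where the margin is just barely positive.

```python
import random

# "Measure, control, act": a minimal sketch of an adaptive supply-voltage loop.
# The sensor model, voltage range, and target margin are hypothetical.
random.seed(0)

V_MIN, V_MAX, V_STEP = 0.6, 1.1, 0.01
TARGET_MARGIN = 0.05              # keep 5% timing slack in hand

def sense_margin(vdd):
    # Stand-in for an on-chip delay sensor: slack grows with voltage,
    # perturbed by environmental noise (temperature, variability).
    return (vdd - 0.7) * 0.5 + random.gauss(0, 0.005)

vdd = V_MAX                       # start at the worst-case-safe voltage
for _ in range(200):
    margin = sense_margin(vdd)            # measure
    if margin > TARGET_MARGIN:            # control
        vdd = max(V_MIN, vdd - V_STEP)    # act: shave the excess margin
    else:
        vdd = min(V_MAX, vdd + V_STEP)
print(f"loop settles near {vdd:.2f} V instead of the worst-case {V_MAX} V")
```

The controller converges to a supply well below the worst-case-safe starting point, tracking the actual conditions the sensor reports rather than a static margin.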
Today's designs, Rabaey noted, are over-designed and constrained by timing and power margins. Tomorrow's designs will be based on a "sense and react" approach and will let designers "eliminate margins and always work at the edge." He acknowledged that "this is a very big paradigm shift, like building a bridge that's on the edge of collapsing all the time. It's kind of scary."
A concept that takes this metaphor further is "aggressive deployment," also known as "better than worst case design." It's based on the observation that worst-case conditions are rarely encountered in actual operation. With aggressive deployment, you might scale your supply voltage more than you should (according to conventional wisdom). Yes, you'll occasionally miss a timing edge, but that's okay if you can detect it and fix it.
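A rough simulation shows the detect-and-fix trade-off. All energy numbers and error rates here are assumed for illustration: running below the worst-case-safe voltage saves energy on every operation, and the rare timing miss is detected and replayed at the safe voltage with a penalty.

```python
import random

# Sketch of "better than worst case" operation (assumed numbers): run below the
# worst-case-safe voltage, detect the rare timing misses, and replay them.
random.seed(42)

E_SAFE = 1.00        # energy per op at the worst-case-safe voltage (a.u.)
E_AGGR = 0.70        # energy per op at the aggressively scaled voltage
ERR_RATE = 0.01      # fraction of ops that miss timing at the lower voltage
REPLAY_COST = 2.0    # a detected miss is redone safely, doubled for the flush

N = 100_000
energy = 0.0
for _ in range(N):
    energy += E_AGGR
    if random.random() < ERR_RATE:          # timing-error detector fires
        energy += REPLAY_COST * E_SAFE      # recover: flush and replay safely
saving = 1 - energy / (N * E_SAFE)
print(f"net energy saving ~ {saving:.0%}")
```

The key is that the replay penalty, amortized over a 1% error rate, costs far less than the margin given up on the other 99% of operations; if the error rate climbed too high, the recovery cost would erase the savings.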
A Heretical Approach to Computing
Are these ideas enough to "scale the power wall"? Not really, Rabaey said. What we really need to do is adopt non-deterministic or "statistical" computing for appropriate applications. And that means that the same set of input conditions might not produce the same output every time, an idea that at first glance sounds heretical in the world of computing.
"We have been so brainwashed in the Boolean, Von Neumann, and Turing based models [of computation]," he said. "We think computation has to be deterministic. Is that really true? Thinking about it might help us get to some interesting places."
Non-deterministic computing is not for every application - you don't want it for your bank account, for example. But for "perception based" applications like image processing, video, or classification, it can work and can save a great deal of energy. Rabaey noted that "anything that has to do with a machine-human interface is a statistical effort. A lot of problems don't expect correct or deterministic answers - as long as you're close, you're fine."
Rabaey showed an "Error-Resilient System Architecture" (ERSA) developed at Stanford, and pointed to a real-world application, an image classifier identifying cars from what could be a satellite image. The conventional approach would never miss a car. The ERSA approach could inject 30,000 random errors, achieve 90% accuracy, and use substantially less energy.
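This is not the Stanford ERSA design itself, but a toy sketch of the same principle: a detector that averages many noisy pixel scores keeps classifying correctly even when a sizable fraction of its partial computations are corrupted, because the statistics of the aggregate absorb the individual errors.

```python
import random

# Toy sketch of error-resilient, "statistical" computation (not the Stanford
# ERSA design): average 100 noisy pixel scores and threshold the mean, with a
# fraction of the partial results corrupted to mimic unreliable hardware.
random.seed(1)

def detect(pixels, error_rate):
    total = 0.0
    for p in pixels:
        if random.random() < error_rate:
            p = random.random()           # corrupted partial result
        total += p
    return total / len(pixels) > 0.5      # "car present" if the mean is high

def accuracy(error_rate, trials=2000):
    correct = 0
    for _ in range(trials):
        car = random.random() < 0.5
        mean = 0.7 if car else 0.3        # cars score high, background low
        pixels = [random.gauss(mean, 0.15) for _ in range(100)]
        correct += detect(pixels, error_rate) == car
    return correct / trials

print(f"error-free accuracy: {accuracy(0.0):.1%}")
print(f"with 10% of partials corrupted: {accuracy(0.10):.1%}")
```

A deterministic pipeline would have to spend energy guaranteeing every one of those partial results; the statistical version can tolerate the errors and bank the savings.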
Can we do better? Rabaey talked about the energy efficiency of the human brain, noting that it has a computational capacity of 10^16 computations per second at 1-2 femtojoules per computation. This is two orders of magnitude beyond what we can currently do in silicon. "It turns out the brain is purely a statistical engine," Rabaey noted. By studying the algorithms the brain uses, he said, we can make silicon-based platforms run more efficiently.
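The arithmetic behind those figures is easy to check: 10^16 operations per second at 1-2 fJ each works out to a compute budget of roughly 10-20 W, consistent with the brain's actual power draw.

```python
# Checking the arithmetic behind the comparison: 10^16 computations per second
# at 1-2 femtojoules (1 fJ = 1e-15 J) per computation.
OPS_PER_SEC = 1e16
for femtojoules in (1.0, 2.0):
    watts = OPS_PER_SEC * femtojoules * 1e-15
    print(f"{femtojoules:.0f} fJ/op -> about {watts:.0f} W")
# A silicon platform two orders of magnitude less efficient would need
# kilowatts for the same throughput.
```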
Finally, Rabaey noted that the brain does its computing in an analog fashion, and analog is inherently statistical, never producing the exact same answer every time. But analog doesn't scale well, so if you want extremely high resolution, digital computing is a better approach. However, most applications don't need high resolution or deterministic outcomes.
In summary, this was a fascinating talk that went far beyond the usual discussion of low power. Sometimes we get so lost in design techniques and power format issues that we lose sight of the larger picture. Rabaey is bringing that larger picture to the design community, and is challenging us to think differently about IC and systems design.