Much has been written about the specific techniques that IC designers can use for low-power design and verification, but a larger context is missing. What's the end goal, and what are the costs, benefits, and challenges of implementing power management? In a lively panel discussion at the DVCon conference Feb. 28, three engineering managers and two EDA vendor representatives painted this broader picture - and came up with a lot of great advice for anyone considering low-power design.
The panel was part of a Cadence-sponsored lunch event titled, "Earn Your Degree in the Low-Power Arts and Sciences" (preview blog was posted here). The panel was moderated by Pete Hardee, solutions marketing director at Cadence, and it included these panelists (left to right in photo below):
Photo by Joe Hupcey III
As he introduced the panel, Hardee noted that advanced low-power techniques are becoming increasingly commonplace, and that they come with "all kinds of second order effects in terms of the problems they introduce for verification." Managing these effects with power formats and tools is the "science" of low-power design, while the methodology is the "art," he noted.
Here are some snapshots from the discussion.
Q: What are the challenges you see with energy management?
Prasad: The main challenge is to understand the importance of energy management, and to make tradeoffs. Energy management spans architecture, RTL, process, design, implementation, and verification - all aspects of the design flow. To get the most energy efficient design we have to manage hardware and software and understand the architecture.
Castagnetti: We have to do all these complex [low power] things while keeping the same design turnaround time.
Marschner: A lot of developers are getting into power management for the first time. They don't have the legacy of how they did it last time. So there's a ramp-up cost as well.
Q: At what stages of the design flow do you need to nail down low-power techniques?
Prasad: For something like power gating, you have to answer architectural questions before you start putting RTL together. When I shut the power off, how fast do I wake it back up? What latency do I have? Do registers need to retain state? How do I do software save and restore? What clamping values do I need to put on outputs?
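Prasad's architectural checklist can be made concrete with a small model. The sketch below is a hypothetical illustration (not something shown at the panel, and all names and values are invented): a power-gated domain whose registers either retain state through shutoff or must be saved and restored by software, with a clamp value driven on outputs while the domain is off and a wake-up latency paid on power-on.

```python
# Hypothetical sketch of a power-gated domain, illustrating the architectural
# questions Prasad raises: retention, software save/restore, output clamping,
# and wake-up latency. Values and names are invented for illustration.

class PowerDomain:
    def __init__(self, retention=False, clamp_value=0, wakeup_latency_us=50):
        self.retention = retention           # do registers keep state when off?
        self.clamp_value = clamp_value       # value forced on outputs while off
        self.wakeup_latency_us = wakeup_latency_us
        self.on = True
        self.regs = {}
        self._saved = None

    def output(self, name):
        # Outputs are clamped to a known value while the domain is off.
        return self.regs.get(name, 0) if self.on else self.clamp_value

    def power_off(self):
        if not self.retention:
            self._saved = dict(self.regs)    # software save before shutoff
            self.regs = {}                   # state is lost without retention
        self.on = False

    def power_on(self):
        self.on = True
        if not self.retention and self._saved is not None:
            self.regs = self._saved          # software restore after wake-up
        return self.wakeup_latency_us        # latency cost of waking the domain

dom = PowerDomain(retention=False, clamp_value=0)
dom.regs["ctrl"] = 0xA5
dom.power_off()
assert dom.output("ctrl") == 0               # clamped while off
latency = dom.power_on()
assert dom.regs["ctrl"] == 0xA5              # restored by "software"
```

The tradeoff the model exposes is exactly the one from the panel: retention registers avoid the save/restore traffic but cost area and leakage, while non-retention domains push the cost into software and wake-up latency.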
Q: Is the cost of these techniques worth it in the mobile world?
Prasad: I think it comes from customer requirements. If you really want a chip with a GHz on your phone, or a particular battery life, you have to work backwards to make sure you can guarantee performance, meet schedules, and avoid burning the chip.
Marschner: There is a cost to manage power. One thing to keep in mind, when building in power management, is that the added logic is going to take more power. There are many layers of this, starting with the software at the top.
Jen: For new companies starting out, my recommendation is to try not to get too complicated. If you have many power domains you might lose more power than you save. Synthesizing with high Vt [voltage threshold] does not necessarily provide the best power - you may do better with normal Vt.
Castagnetti: We mostly build our chips for wired applications. The fact that we're considering things like power gating shows that we're hitting the same wall the mobile guys hit long ago. If I can reduce power by X percent, or go to a cheaper heat sink or lower my fan speed, or run the system more quietly, there's a lower cost of ownership, and our customers are looking for that.
Wang: One interesting observation I have is that many customers don't think too much about the end goal they're trying to achieve - they think about new low power techniques they can adopt. People are looking at techniques like power gating or double-edge flip flops, and forgetting what the costs are. There's no free lunch. You need to think about your project schedule and your expertise level. Think before you say, "my competitor is using this low power technique so I want to use it as well."
Marschner: Yes, be careful that what you adopt is not part of an arms race. But the most expensive thing you can do is to adopt less than you need and end up with dead silicon.
Q: Systems that use power management are ultimately under software control. How do you guys figure software into your verification plans?
Castagnetti: In our case we don't always own the software, so we have to make sure that whatever verification strategy we adopt creates a fail-safe system, and make sure as many cases as possible are covered.
Prasad: The software guys are completely power agnostic. For them it's memory writes and DDR reads, and there is no cost associated with [power]. There is a performance cost, because there is benchmarking for performance and throughput, but unfortunately not for power. There needs to be some kind of communication between hardware and software folks about what can be done on the software side to save power.
Wang: Clearly there is a challenge for the industry to create a platform, or a technology, to drive hardware/software co-verification. This co-verification has special value in terms of power.
Q: How do you drive the system into all the modes you need to check out power, and look at the transitions between each mode?
Prasad: One way to do that is through the power intent file. We define power states and power domains. As part of functional verification, you have to guarantee 100 percent coverage of all these system level power states.
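As a rough illustration of what "100 percent coverage of system-level power states" means, the hypothetical sketch below (not tied to any particular power format; states and transitions are invented) enumerates the legal states and transitions from a power intent description and checks which ones a simulated trace actually exercised:

```python
# Hypothetical sketch: measure coverage of power states and legal transitions
# declared in a power intent description. States/transitions are invented.

legal_states = {"ON", "RETENTION", "OFF"}
legal_transitions = {("ON", "RETENTION"), ("RETENTION", "ON"),
                     ("ON", "OFF"), ("OFF", "ON")}

def coverage(trace):
    """Given a simulated sequence of power states, report what was exercised."""
    seen_states = set(trace) & legal_states
    seen_trans = {t for t in zip(trace, trace[1:]) if t in legal_transitions}
    return (len(seen_states) / len(legal_states),
            len(seen_trans) / len(legal_transitions))

trace = ["ON", "RETENTION", "ON", "OFF"]   # simulation never woke up from OFF
state_cov, trans_cov = coverage(trace)
print(f"state coverage {state_cov:.0%}, transition coverage {trans_cov:.0%}")
# Sign-off would require both numbers to reach 100%.
```

Note how the trace above hits every state but misses a transition, which is precisely the kind of gap that transition-level coverage goals are meant to expose.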
Jen: A lot of the time people just do positive testing. You should also focus on negatives, on things that shouldn't happen.
Wang: People forget about a lot of static verification techniques that can help meet their goals. With exhaustive, formal verification you can prove a lot of easy cases and ease your verification task.
Q: Are you satisfied with standards as they exist today for verification and power intent files?
Prasad: As a designer I would not like to see multiple standards. The second thing is that they should be very simple.
Marschner: I think everyone on this panel is connected to the IEEE 1801 [Unified Power Format] working group in some way. I think what you're seeing is a unification of purpose to come up with a standard we can all use that solves our problems. Ultimately syntax doesn't matter; what matters is getting the tools in place that people can use. If we do that we'll all be successful [applause].
Wang: I agree with Eric, but one thing to keep in mind is that we need to stick to the standard. Otherwise it will be like having a single language that we understand in different ways. It's important to make the semantics really clear.
Q: How does low power design apply to mixed-signal?
Wang: With low power design and mixed-signal, some new challenges come up. For example, you can get an X state from power shutoff. That makes sense to the digital guys, but the analog guys ask, what is that?
Prasad: The challenge is library modeling. As SoC designers we just get a library model we believe is good and correct, but how was that model designed and verified?
Marschner: Everything is ultimately analog. We've been using digital abstractions for a long time and we could ignore analog, but with low power, even in a digital world, we're losing that abstraction.
Q: What's the best approach to power estimation?
Wang: Traditionally people do power estimation using functional vectors, but functional vectors are never the real scenario. To get real power estimation you need to run applications - your games, your videos. Simulation acceleration and emulation technology can help you get real activity data.
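The role of activity data can be seen in the standard dynamic-power approximation P = α·C·V²·f: the switching activity factor α is exactly what realistic workloads change relative to functional vectors. The toy sketch below uses invented capacitance and activity numbers purely to illustrate the point:

```python
# Toy dynamic power estimate: P = alpha * C * V^2 * f, summed over nets.
# The activity factor alpha is what real application traces (e.g. from
# emulation) change; capacitances and frequencies here are invented.

def dynamic_power(nets, vdd=1.0, freq_hz=1e9):
    """nets: list of (capacitance_farads, activity_factor) pairs."""
    return sum(c * alpha for c, alpha in nets) * vdd**2 * freq_hz

# Same netlist, two activity profiles: directed test vectors vs. a workload.
nets_idle_vectors = [(2e-15, 0.05)] * 10000   # low toggling in directed tests
nets_real_app     = [(2e-15, 0.20)] * 10000   # a video workload toggles more

print(dynamic_power(nets_idle_vectors))       # watts under synthetic vectors
print(dynamic_power(nets_real_app))           # watts under realistic activity
```

With identical hardware, the realistic activity profile reports four times the dynamic power here, which is why Wang argues for running real applications on acceleration or emulation platforms rather than trusting functional vectors alone.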
Prasad: The idea is to do successive refinement. You come up with a ballpark figure when you have your RTL; as you do gate-level analysis, you fine-tune that number; and finally there's post-silicon correlation activity and power analysis.
A Concluding Remark
My biggest "aha" moment came from Qi Wang's comment that many design and verification teams focus on the details of low-power techniques rather than their end goals, and fail to fully consider the costs of those techniques. Those costs may include not only silicon area and performance, but also turn-around time and an added need for verification resources. The first challenge is to have a clear picture of what you're trying to achieve. The "science" of tools and power formats, and the "art" of methodology, can then help you find the simplest and most cost-effective path to reach that goal.
Other DVCon 2012 Coverage
DVCon 2012: Accellera "Town Hall" Meeting Explores Future of EDA Standards