Bring together three of the best-known and most opinionated voices in EDA - along with a Cadence R&D executive - and what do you have? A spirited panel discussion on electronic system level (ESL) functional verification, along with a call for a "top down" verification approach guided by requirements and use cases.
The panel took place at the System-to-Silicon Summit held Sept. 26, 2013 at Cadence headquarters in San Jose, California. Brian Fuller, editor-in-chief at Cadence, served as moderator. Panelists were Gary Smith, Jim Hogan, Bailey, and Binyamini, quoted by surname in the excerpts below.
The panel discussion began with a presentation in which Gary Smith stated that a "true" ESL design and verification flow now exists. (This is essentially the same message that was conveyed in an August 2013 webinar, reported in my recent blog post.) Basically, the flow started to emerge in 2011, but in 2012 it became evident that power information was lacking. The lesson of 2013 was the need for emulation and acceleration in the flow. The ESL flow is now working but is put together with "glue and baling wire," Smith said.
Smith also commented on what he called the "crisis situation" in the embedded software tools market, in which most of the major players have gone out of business or been acquired. "Embedded [software] will be part of the EDA business just as IP is now part of the EDA business," he predicted.
Below are some excerpts from the panel discussion that followed.
Q: What do we need to do to make verification smarter?
Bailey: In doing bottom-up verification, we have been concentrating on the notion of exhaustive verification. As we build up a system, many times we spend time and effort verifying something that could never actually happen. By thinking of the problem from the top down, and analyzing use cases with power, we can identify the important aspects of what needs to be verified early in the process.
Hogan: You start with behavior. Out of that you can drive your verification suite and your assertions, and hopefully give yourself a smaller set to go after. If your scenarios capture 90% of the run time of the system, and you feel confident you have verified them, then I think you're ready to go to market.
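Hogan's "90% of the run time" rule of thumb can be sketched as a simple prioritization step. The sketch below is purely illustrative (the scenario names and run-time shares are invented, not from any real tool or profile): given profiled scenarios, it greedily selects the smallest set whose combined share of system run time crosses a coverage threshold.

```python
def select_scenarios(runtime_share, threshold=0.90):
    """Greedily pick scenarios (largest run-time share first) until
    cumulative coverage reaches the threshold."""
    selected, covered = [], 0.0
    for name, share in sorted(runtime_share.items(),
                              key=lambda kv: kv[1], reverse=True):
        if covered >= threshold:
            break
        selected.append(name)
        covered += share
    return selected, covered

# Hypothetical run-time profile of a handset SoC's use scenarios
profile = {
    "video_playback": 0.40,
    "web_browsing":   0.30,
    "voice_call":     0.15,
    "camera":         0.10,
    "diagnostics":    0.05,
}
chosen, coverage = select_scenarios(profile)
print(chosen, round(coverage, 2))  # four scenarios reach 95% coverage
```

The greedy order matters only for minimizing the set; any set of scenarios that clears the threshold would satisfy the confidence bar Hogan describes.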
Smith: In 1996 we came up with the idea of the intelligent testbench. The problem we had then was that we would verify things over and over. The idea was to identify which blocks are already verified and set them aside. We're starting to get intelligent testbench products out the door.
Binyamini: Current bottom-up testbench approaches are pretty good. What is missing today is the top-down approach. The key is to define the requirements. You find use cases and define them in some abstract way that allows you to drive the requirements from the beginning of the design. The only way to define the requirements is against the use cases.
Q: Ten years ago design teams were much larger than verification teams and now they're neck and neck. How are things going to evolve in the next ten years?
Bailey: If we don't change the way we do verification, the number of verification engineers will totally swamp the design team. If we start putting executable requirements and use cases in place, I think we can control it a lot more.
Hogan: One of my companies is building an application-specific processor for a cell phone company. I'm working on an IP verification strategy with that team. We work on things you wouldn't think of as verification. Team synchronization is the hard part. The thing I'm most worried about is managing a large design.
Binyamini: If we ask whether verification is going to outgrow design, we should consider that software is outgrowing all of this by leaps and bounds. We know of companies that have 4X, 8X, 20X more software designers than hardware designers, design and verification combined.
Another point is that most designs, at least in big companies, are actually derivatives of previous ones. If they make a change in one IP subsystem, 80% of the effort can be verification. Verification and design are oriented to IP, and this is an integration or assembly problem.
Bailey: By saying it's an integration problem, we're accepting a bottom-up flow. If things are in place to feed the requirements down through use cases, we can identify which of those use cases have been affected by a design change.
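Bailey's point assumes traceability from requirements through use cases down to blocks. A minimal sketch of that idea, with an invented traceability table (the use-case and block names are illustrative only): given a changed block, map back to the use cases that exercise it and therefore need re-verification.

```python
# Hypothetical use-case -> block traceability data
use_case_blocks = {
    "boot_and_standby": {"cpu", "pmu"},
    "video_playback":   {"cpu", "gpu", "display", "pmu"},
    "ota_update":       {"cpu", "modem", "flash_ctrl"},
}

def affected_use_cases(changed_block, trace=use_case_blocks):
    """Return the use cases that exercise the changed block."""
    return sorted(uc for uc, blocks in trace.items()
                  if changed_block in blocks)

print(affected_use_cases("pmu"))  # use cases touching the power unit
```

In a derivative design, re-running only the affected use cases is what keeps the 80%-verification integration effort (Binyamini's figure above) from becoming a full re-verification.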
Binyamini: Where there is a top down, there is also a bottom up. Both need to co-exist.
Q: Software is increasing in workload and complexity. Is simulation acceleration running out of steam?
Hogan: Simulation acceleration will always be around because often you can run things in parallel. But simulation is never going to answer the questions emulation does. If you're building any kind of system at all, you're going to have to emulate.
Q: One of the things we're ignoring is that devices now are consumer oriented, and most devices of 25 years ago were not. Older devices were highly constrained, and today's devices are not. Can we constrain them enough to maintain equal numbers of design and verification engineers?
Bailey: Particularly for consumer oriented things, we have to define use cases. You can never define all of them but you have to start the definition where the important set is.
Binyamini: Most verification is going to go up to the software and system level. The use-case driven approach is not just for hardware, it's mainly for software. The problem with software is there is much less efficiency and there is no automation.
Smith: Keep in mind that the embedded software tools market has been in a crisis situation for the last five years. Embedded designers have almost no tools for verification. We're going to have to supply them with two sets of tools. One is a go/no-go tool that says whether you met your power budget or not. Another is a static tool that allows them to design power-aware software.
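A toy version of the go/no-go power-budget check Smith describes might look like the following. All component names and milliwatt figures are made up for illustration; a real tool would draw estimates from power models of the actual workload.

```python
def power_go_no_go(component_mw, budget_mw):
    """Sum estimated per-component power for a workload and
    return a GO/NO-GO verdict against the budget."""
    total = sum(component_mw.values())
    return ("GO" if total <= budget_mw else "NO-GO", total)

# Hypothetical power estimate for one software workload
workload = {"cpu": 450, "gpu": 300, "modem": 120, "display": 200}
verdict, total = power_go_no_go(workload, budget_mw=1200)
print(verdict, total)  # 1070 mW against a 1200 mW budget
```

The second, static tool Smith mentions would go further, attributing the per-component totals back to the software constructs that cause them.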
Hogan: What's the value add of verification? I want the minimum amount of verification, the absolute minimum. You've got to get the spec and the behavior right and exhaustively verify the blocks. Then during integration you can use verification as a confidence builder.
I bought an iPhone 4 and my battery was chewed up in two hours. I did a software upload and it went to eight hours. That's the difference these days - there's a software filter that can extend the life of the [hardware] platform. How does this affect verification? It's got to be sufficient, and if not we'll fix it later in software.
Related Blog Posts
Gary Smith Webinar: "The True ESL Flow is Now Real"
"Unhinged" Complete interview with Jim Hogan