Recently I wrote about a panel discussion that looked at ways of bridging the gap between analog and digital design. This blog post resulted in a lengthy discussion in a LinkedIn group that brought up the topic of verification. One commentator noted that analog and digital designers have very different interpretations of "verification," and concluded that "people have been trying for years to squeeze the round analog peg into the square digital hole."
This comment was a response to a panel discussion in which panelists were asked the following question: "In the digital world we talk about how 70% of the effort is spent in verification. How much time is spent in analog verification?"
Panelist Navraj Nandra (Synopsys) replied that the 70% estimate "is not far off but the color of verification is different." He explained that customers in the analog world are looking for things like I-V (current-voltage) numbers, and they expect silicon characterization.
Panelist Mladen Nizic (Cadence) noted that we need to distinguish functional verification from signoff verification. "If we talk about digital, it's usually just functional verification," he said. "We need to invest more in analog/mixed-signal functional verification. I see more and more adoption of digital techniques like coverage driven verification and random stimulus. This combination is really needed." For signoff verification, he said, a hierarchical approach can alleviate the need for heavy SPICE simulation runs.
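To make the combination Nizic describes concrete, here is a toy sketch of coverage-driven verification with random stimulus. This is illustrative only -- real flows use SystemVerilog covergroups and constrained-random generation inside a simulator, not Python -- and the bin boundaries and voltage range are invented for the example. The idea is simply that randomized stimulus keeps running until every coverage bin has been exercised:

```python
import random

# Coverage bins partition the stimulus range; randomize inputs until
# every bin has been hit, then report how many tests it took.
# (Toy model of coverage-driven random verification -- not tool syntax.)

BINS = [(0.0, 0.5), (0.5, 1.0), (1.0, 1.5), (1.5, 2.0)]  # volts, assumed

def bin_index(v):
    """Return the index of the coverage bin that v falls into, or None."""
    for i, (lo, hi) in enumerate(BINS):
        if lo <= v < hi:
            return i
    return None

def run_random_tests(seed=1, max_tests=10_000):
    rng = random.Random(seed)
    hit = set()       # bins exercised so far
    tests = 0
    while len(hit) < len(BINS) and tests < max_tests:
        v = rng.uniform(0.0, 2.0)   # random stimulus value
        idx = bin_index(v)
        if idx is not None:
            hit.add(idx)
        tests += 1
    coverage = len(hit) / len(BINS)
    return tests, coverage

tests, coverage = run_random_tests()
print(f"{tests} random tests reached {coverage:.0%} bin coverage")
```

The metric-driven part is the loop's exit condition: the testbench stops (or, in practice, the verification plan is marked complete) only when the coverage metric, not a fixed test count, says so.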
Analog "Verification" is Part of Design
The LinkedIn commentator noted that there may be a more fundamental difference. While digital engineers view "verification" as a separate step, analog designers simply see it as part of the design process. The analog engineer develops an architecture, runs simulation to see if the architecture meets the specs, modifies the architecture if not, and runs the simulation again. Thus, verification is an iterative process, not something handed off to someone else.
If verification is part of the design process in a pure analog world, this could explain why we don't see separate verification teams for analog, as we do for digital. But wait a minute. The world is no longer pure analog or pure digital -- it is increasingly mixed-signal. Even "analog" IP blocks are likely to have some digital control logic, and nearly all "digital" systems-on-chip (SoCs) contain some "analog" (and mixed-signal) IP blocks.
It is at this intersection of analog and digital that a new verification paradigm is emerging -- at least, new for the analog folks. Techniques such as verification planning, random test generation, coverage, metric-driven verification, and assertions are in use today in mixed-signal settings, and they are working. Proof of this came in a Design Automation Conference 2011 panel that I moderated (photo below) at the Cadence booth in which engineers from Qualcomm, NXP Semiconductors, and LSI described how their companies are bringing digital verification techniques into the analog/mixed-signal world.
DAC 2011 mixed-signal panel (photo by Joe Hupcey III)
Among other topics, the engineers discussed real number (wreal) modeling, which allows ranges of analog values to be represented in digital simulation environments; the need for separate analog verification teams; the use of verification planning, analog coverage, and analog assertions; mixed-signal design with the Universal Verification Methodology (UVM); and the need for analog/mixed-signal language extensions to SystemC, SystemVerilog and UVM. You can read my report of the discussion here.
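The flavor of real number modeling can be sketched in a few lines: an analog block is reduced to a behavioral function over real-valued samples, evaluated inside a discrete, event-driven testbench loop, with an analog-style assertion checking the result. This is a Python toy, not Verilog-AMS wreal syntax, and the amplifier model, supply rail, and frequencies are all invented for illustration:

```python
import math

# Real-number ("wreal"-style) modeling sketch: the analog signal is a
# stream of real-valued samples; the amplifier is a behavioral function;
# an assertion-style check runs at every discrete timestep.
# All names and values here are hypothetical.

VDD = 1.8          # supply rail (assumed)
F_SIG = 1e3        # 1 kHz test tone
F_SAMPLE = 100e3   # discrete simulation timestep rate

def amplifier_model(vin, gain=2.0):
    """Behavioral real-number model of an amplifier with rail clipping."""
    return max(0.0, min(VDD, gain * vin))

def run_testbench(n_samples=200):
    violations = 0
    samples = []
    for n in range(n_samples):
        t = n / F_SAMPLE
        vin = 0.5 + 0.4 * math.sin(2 * math.pi * F_SIG * t)
        vout = amplifier_model(vin)
        # analog-style assertion: output must stay within the rails
        if not (0.0 <= vout <= VDD):
            violations += 1
        samples.append(vout)
    return samples, violations

samples, violations = run_testbench()
print(f"{len(samples)} samples simulated, {violations} assertion failures")
```

Because the model is just arithmetic on reals rather than a system of circuit equations, it runs at digital-simulation speed -- which is exactly the tradeoff the panelists were describing.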
Of course, SPICE simulation is still needed at the analog block level, and probably will be for many years to come. Multi-core implementations of SPICE, like the Virtuoso Accelerated Parallel Simulator (APS), can be of great help in this regard. What is needed is a range of analog/mixed-signal modeling and simulation techniques, from SPICE to Verilog-AMS to real number modeling, so engineers can make the proper tradeoffs between speed and accuracy.
When it comes to mixed-signal SoCs, a separation between "analog" and "digital" is no longer relevant, and no IP block should be treated as a black box to be thrown over the wall to the other side. The interaction between analog and digital circuitry is one of the greatest sources of errors, and when you add low-power design techniques to the mix, it's even more hazardous. Thus, thorough mixed-signal verification is essential, and it must be completed in a reasonable period of time.
How can this be done? What is needed is an integrated mixed-signal design and verification environment that combines the best of both analog and digital worlds, preferably using a common database such as OpenAccess. And digital techniques such as executable verification planning, assertions, and metric-driven verification must come along with this environment, along with wreal modeling at digital simulation speeds. This is how we get away from the "round peg into the square hole" problem. A point tool approach won't do it.
But even as analog and digital worlds become increasingly intertwined, we need to remember that words like "verification" can carry very different meanings in our respective worlds. In addition to adopting the right tools and methodologies, we need to learn to speak a common language, or at least begin to understand each other's dialects.
Note: The recently published book Advanced Verification Topics has a detailed chapter on using metric-driven verification and UVM-MS (UVM with mixed-signal extensions) for analog/mixed-signal design. You can read my review of the book here.