If you want to know how challenging mixed-signal verification really is, the best approach is to listen to the people in the trenches. A Cadence-sponsored lunch panel at the DVCon conference on March 3 allowed an attentive audience to do just that.
The panel included three users and two vendor representatives: Chen (Qualcomm), Aprameyan (Micron), Sarkinen (Medtronic), Kashai (Cadence), and Bakalar (Mentor Graphics).
The panel was moderated by Ken Kundert, co-founder of Designer's Guide Consulting (and a former Cadence Fellow and the developer of the Spectre simulator). Kundert set the tone by noting that the digital verification world has been largely separate from analog/mixed-signal design, with the analog portion of a system-on-chip (SoC) typically treated as a black box. "I'm hoping that as time goes forward we can light up that black box and make it part of the whole verification effort," he said.
Kundert noted just how different analog and digital design and verification really are. Analog designers focus on performance verification, while digital designers focus on functional verification. Also, analog design is schematic-based, while digital design is text-based. When SoC teams combine analog and digital blocks, modeling is challenging and verification capabilities are limited.
Users Cite Challenges
Chen (Qualcomm) noted that there are two kinds of verification. Performance verification, used by analog engineers, has to do with things like noise and distortion. Functional verification, used by digital engineers, looks at whether things are connected properly. "There are tools for performance verification, and on the digital side there are lots of functional verification tools, but when you hit functional verification for a large chip neither of these toolsets works well," he said.
When an analog designer draws a schematic, Chen noted, no behavioral model is created. Thus, when an analog block is moved into a digital simulation, a behavioral model must be hand-crafted. Modeling is "where the challenge lies," he said, and one big challenge is writing a model that will work across all simulators.
Aprameyan (Micron) works with NAND flash design, which, as he noted, has the complexities of both mixed-signal design and digital design. "We rely on digital simulation to get good coverage, but we also have to rely on analog simulation to make sure performance is met," he said. "There are very unique challenges when we have a lot of digital and a lot of analog blocks."
Sarkinen (Medtronic) focuses on the integration of IP into mixed-signal SoCs. It's complex, he said, because there may be numerous digital interfaces driving analog circuitry. "If we were to simulate each block standalone, there would be a potential to miss deeper interactions," he said. He also noted that a "vertical reuse methodology" is needed so that the same model can be used for low-level, block-level, and chip-level simulations.
Despite the challenges, the user panelists are making progress. Chen described how Qualcomm engineers develop a verification plan, come up with a modeling strategy, "interrogate" reluctant designers in order to "extract" modeling specifications, use models in both digital and analog environments, and go through a final design review of the models before running regressions.
At Micron, Aprameyan is using assertions for analog verification. "You can trigger an event in an assertion in a Verilog-A model that, depending on waveform values, can spit out an error message in text form. So rather than a waveform you're looking at a log file," he said.
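As a sketch of the kind of check Aprameyan described, a Verilog-A monitor can watch an analog waveform and emit a text error when it crosses a limit, so a regression can scan a log file instead of requiring waveform inspection. This is an illustrative example, not Micron's actual code; the module, signal names, and threshold are assumptions:

```
// Illustrative Verilog-A monitor: logs a text error when the
// supply node droops below a threshold, so failures show up
// in a log file rather than only in a waveform viewer.
`include "disciplines.vams"

module supply_monitor(vdd);
  input vdd;
  electrical vdd;
  parameter real vmin = 1.62;  // minimum allowed supply (V), illustrative

  analog begin
    // Fires on a downward crossing of the threshold
    @(cross(V(vdd) - vmin, -1))
      $strobe("ERROR: V(vdd)=%g fell below %g V at t=%g s",
              V(vdd), vmin, $abstime);
  end
endmodule
```

Instantiated alongside the design under test, a monitor like this turns an analog performance condition into the same pass/fail log output a digital regression flow already knows how to collect.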
Sarkinen said that Medtronic has "a methodology in place that's pretty much borrowed from digital simulation." The company uses the Cadence Specman environment for digital verification in combination with Verilog-AMS wreal models. Further, engineers can use the same testbench to simulate models at different abstraction levels.
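For context, a wreal model of the kind Medtronic's flow relies on carries a real-valued analog quantity on a net that the digital solver evaluates, so the same testbench can drive it alongside RTL. A minimal hypothetical sketch, with illustrative names and values:

```
// Illustrative Verilog-AMS wreal model: a voltage reference
// whose output is a real value on a digitally simulated net,
// fast enough to include in chip-level digital regressions.
module vref_wreal(en, vref);
  input  en;
  output vref;
  wreal  vref;

  // Behavioral abstraction: ideal 1.2 V output when enabled
  assign vref = en ? 1.2 : 0.0;
endmodule
```

Because the module's port interface can match a transistor-level or Verilog-A view of the same block, the testbench stays unchanged as the abstraction level is swapped, which is the vertical-reuse property Sarkinen described.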
Kundert asked the vendor representatives to give their perspectives. "The analog world is getting to the point where one has to introduce the kind of methodology that digital verification is using," said Kashai (Cadence). "Most analog guys verify things manually...we need automation." He mentioned UVM-MS (Universal Verification Methodology - Mixed Signal) as a methodology that makes it possible to develop a verification plan, generate stimulus, and monitor coverage. (Cadence presented a paper on UVM-MS at DVCon).
Bakalar (Mentor) spoke of the need for an "integrated verification flow" for both analog and digital realms. He cited several essential elements of this flow, including instrumentation that can monitor verification results, new solutions for debugging, and improved performance for analog simulation.
What Users Want Most
Each of the user panelists had a "wish list." At the top of Aprameyan's list is the need to speed up analog simulation. In addition to assertions, Micron uses the wreal data type with Cadence simulation tools, he noted.
Sarkinen would like to see analog simulation that's simplified enough for digital designers to use. He'd also like to see better convergence and debugging. Chen called for a modeling language that is "portable" and "adequate" for his needs. In response to a question, he said that Verilog-AMS doesn't meet these criteria. One reason is the restrictions the Language Reference Manual (LRM) places on wreal models.
These three users are probably far ahead of most design teams when it comes to forging workable mixed-signal verification strategies that take advantage of the features of digital verification. The challenges they articulated are daunting, but the progress they've made is, to borrow Kundert's terminology, helping light up the "black boxes" that stand in their way.
Photo by Joe Hupcey III