Paul McLellan
DAC
analog
mixed signal
56dac
spectre x

DAC: Opening Lunchboxes and Closing Mixed-Signal Verification

26 Jun 2019 • 7 minute read

The analog/mixed-signal lunch at DAC got moved to Monday this year, since we had made the Spectre X Simulator announcement that morning. For details on that, see my post Spectre X: Same Accuracy, New Speed. The topic this year was Closing Analog and Mixed-Signal Verification in 5G, HPC, and Automotive. The first challenge was finding the room where the lunch was being held. The smart way was to go out the back door of the convention center and walk about twenty-five yards. I didn't go the smart way, and went out the front of the convention center and walked a quarter of a mile around all the buildings in the Las Vegas summer heat. I ran into Semiwiki's Daniel Payne, who seemed to have found an even less optimal route.

The panel was moderated by Professor Georges Gielen of KU Leuven in Belgium. He leads a research group on analog/MS design automation. This is becoming increasingly important since most things are now networked and have more and more sensors.

The panel was:

  • YY Chen of MediaTek, who leads a circuit technology CAD team
  • Atul Bhargava of STMicroelectronics, in charge of analog flows
  • Roopashree (Roopa) HM of Texas Instruments, analog EDA lead, responsible for the entire flow for all analog products
  • Vinod Kariat of Cadence, R&D lead for all circuit simulation development, including the Spectre X Simulator

Georges started off by asking each panelist to give a brief introduction.

Roopa went first and said that TI is in almost all vertical segments: personal, medical, industrial, automotive. When it comes to mixed-signal simulation verification, it is especially interesting in industrial and automotive, with low voltage and high voltage on the same chip, the need for reliability simulations, and analog fault coverage.

Vinod, as R&D lead, talks to many customers with many different profiles. Often he winds up at the same customer as Paul Cunningham, who runs digital verification. About 80% of CPU usage during a design is spent on analog and digital verification. Advanced-node customers have to deal with an increasing number of parasitics. Automotive is especially complex, with aging, reliability, and safety. On top of that there are power-saving modes, since lots of devices run on batteries. So there are many different low-power voltages, including near threshold, which leads to a huge amount of verification. For analog, circuit simulation is the workhorse.

So we need to continuously improve speed and capacity. Today we announced the Spectre X Simulator, which we have parallelized.

YY worries about how to create a good model by working with the system designer to understand and compress the behavioral model. One challenge is that it is hard for a single person to understand both coding and circuit simulation. They are also investigating how to use AI to speed up simulation.

Atul said ST, like TI, has a very diversified portfolio, but what is common across these SoCs is that design is getting more and more complex. It's not just automotive that needs to be more reliable; the industry is “reliability hungry”. New nodes have new layout-dependent effects, and the growth in the number of transistors and parasitics means that they need a solution that guarantees accuracy but at the same time delivers performance.

Georges agreed that functional verification of analog circuits currently goes through circuit simulation, and wondered whether the tools we have are sufficient.

Vinod said that it is endless. When we get something we always need more. Design complexity is increasing, so there really is a need for more, not just in terms of the tool but also in methodology.

Roopa feels that simulation is never-ending and we've got to a point where analog/MS needs a more formal coverage approach. Otherwise you just do more and more without real justification. Of course, faster is good, and more cores are good. But more formal methods in the mixed-signal space would be a good direction for the future.

Vinod pointed out that in terms of problems in SPICE, we can routinely simulate a million devices and millions and millions of parasitics, and sometimes even 8-10M devices. We keep improving performance and capacity, but there are always bigger problems.

As to formal methods, we don't have much of that today in the analog space. Formal methods are very strong in digital; for analog, we need some fundamental technology breakthroughs, innovations we can build on. There is a lot more art in analog design and layout compared to digital.

Georges wondered if Vinod is keeping up with the growth in complexity.

Vinod reckons we are ahead of the circuit complexity, but other aspects, such as the number of modes and vectors, have grown too fast.

Roopa agreed that we are far from formal for analog, but the type of integrated approach we use for digital verification could be used for analog too.

Atul said that for him 3 sigma has gone to 4 sigma, and sometimes even 5 or 6 sigma. Analog blocks have become a lot more complex, and it is not just the size of the circuit.

Georges wondered if AI will help us.

Atul reckons it has promise, but still has to be proven. This is most acute in automotive, where we no longer talk about parts-per-million but parts-per-billion.
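To get a rough sense of how those sigma targets connect to ppm and ppb figures, here is a small back-of-the-envelope sketch. It assumes a simple one-sided Gaussian tail, which is only an approximation of how real yield and defect rates are assessed:

```python
# Rough mapping from sigma targets to defect rates, assuming a
# one-sided Gaussian tail (a simplification of real yield analysis).
from scipy.stats import norm

for sigma in (3, 4, 5, 6):
    tail = norm.sf(sigma)  # P(X > sigma) for a standard normal
    print(f"{sigma} sigma: ~{tail * 1e6:.3f} ppm (~{tail * 1e9:,.0f} ppb)")

# Approximate output:
#   3 sigma: ~1350 ppm   -> parts-per-million territory
#   4 sigma: ~32 ppm
#   5 sigma: ~0.29 ppm
#   6 sigma: ~0.001 ppm  -> roughly 1 ppb, parts-per-billion territory
```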

One area where YY thinks there is scope for AI to improve things is the digital/analog boundary, including power and timing: "Once a designer updates their schematics, we need to do everything again and again. It's just not efficient."

Georges moved on from AI to behavioral models, and wondered if there is any evolution towards making model generation more systematic.

YY wants some way to "equivalence check" the model to make it clear whether it maps the behavior of the schematic or not.
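There is no standard analog "equivalence check" today, but a minimal sketch of the idea is to compare the behavioral model's transient response against the schematic-level simulation on a common time grid. The file names, column layout, and 5 mV tolerance below are hypothetical placeholders, not an established methodology:

```python
# Minimal sketch: compare a behavioral model's transient output against the
# schematic-level (SPICE) output. File names, column layout, and the 5 mV
# tolerance are hypothetical placeholders.
import numpy as np

def max_deviation(t_ref, v_ref, t_model, v_model):
    """Resample the model waveform onto the reference time grid and
    return the worst-case absolute difference."""
    v_model_resampled = np.interp(t_ref, t_model, v_model)
    return np.max(np.abs(v_ref - v_model_resampled))

# Two-column CSV files: time (s), voltage (V)
t_spice, v_spice = np.loadtxt("spice_out.csv", delimiter=",", unpack=True)
t_bmod, v_bmod = np.loadtxt("behav_out.csv", delimiter=",", unpack=True)

error = max_deviation(t_spice, v_spice, t_bmod, v_bmod)
print(f"worst-case deviation: {error * 1e3:.2f} mV")
print("PASS" if error < 5e-3 else "FAIL: model does not track the schematic")
```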

Atul agreed, but pointed out that SPICE also keeps getting better.

Twenty years ago, true SPICE was good enough; then we did some trials with Spectre X and got a 4X gain. The number of transistors has gone from 30,000 to 1M. Of course there is room on the behavioral side where we can go even faster.

Vinod is proud of his team and agrees these have been good gains. As design complexity goes up, the engineering team tries to keep ahead. The holy grail is somehow to automatically create the behavioral model. There are some assisted models, but it depends critically on the skill of the team. Some teams use it a lot and others won't go there at all. It's not the skill of a circuit designer, not the skill of a programmer; it's somewhere in the middle.

As we approach the end of Moore's Law, Georges wondered whether it will break the tools.

Vinod doesn't see any problem down to nodes like 3nm. But "eventually we get to single electrons and we will need to change."

Everyone agreed that it's not broken...at least not yet.

"What about fault simulation?" Georges asked.

Atul said that today we're doing digital fault injection for analog circuits and we're close to solving that.

But analog faults in analog circuits, such as a resistor or a transistor whose value changes dynamically, we have not really addressed; we're pretty far from there. For digital, we're close to being solved.

Vinod has been hearing discussion of analog faults for 10 years but there is no real agreement on what the models should look like. Digital fault simulation has a healthy background of techniques in the literature, but in analog we don't even have a clear notion of "covering" a fault. So there is a lot of research to be done. It is nascent. But we do have more capability than we had three years ago.
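To make the contrast concrete, here is a toy sketch of the kind of analog fault enumeration being discussed: taking a SPICE-style netlist and generating short, open, and parametric-drift variants for each resistor. The netlist handling is deliberately simplistic, and the fault values (1 mΩ short, 1 GΩ open, ±30% drift) are illustrative assumptions, not a standardized analog fault model:

```python
# Toy illustration of analog fault enumeration: for each resistor in a
# SPICE-style netlist, emit short, open, and +/-30% drift variants.
# Fault values here are illustrative assumptions, not a standard model.

NETLIST = """\
R1 in  mid 10k
R2 mid out 22k
C1 out 0   1p
"""

def scale(line: str, factor: float) -> str:
    """Scale a resistor value given in plain ohms or with a k suffix."""
    raw = line.split()[3]
    mult = 1e3 if raw.lower().endswith("k") else 1.0
    return f"{float(raw.rstrip('kK')) * mult * factor:.6g}"

def resistor_faults(netlist: str):
    """Yield (fault_name, faulted_netlist) pairs for every resistor line."""
    lines = netlist.splitlines()
    for i, line in enumerate(lines):
        fields = line.split()
        if not fields or not fields[0].upper().startswith("R"):
            continue
        node_part = " ".join(fields[:3])  # component name + two nodes
        variants = [("short", "1m"), ("open", "1G"),
                    ("drift_-30%", scale(line, 0.7)),
                    ("drift_+30%", scale(line, 1.3))]
        for label, value in variants:
            faulted = lines.copy()
            faulted[i] = f"{node_part} {value}"
            yield f"{fields[0]}_{label}", "\n".join(faulted)

for fault_name, faulted_netlist in resistor_faults(NETLIST):
    # In a real flow each variant would be simulated and compared against
    # the fault-free ("golden") response to decide whether it is covered.
    print(fault_name)
```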

Georges' final question was for each of the panelists to state their highest priority between speed, capacity, and accuracy. And a second question, "What are you not doing today due to limited speed, capacity, or accuracy?"

Atul thinks it depends on the domain, so he couldn't give a single answer. For medical, accuracy comes first, but that's not the case for every application. There is no 100% accuracy anyway; people are always looking at a lot of statistical data.

YY was clear that functional check is the most important.

Vinod thinks we provide sufficient accuracy today. "I don't usually get challenged that we need more accuracy." The challenge is that for any accuracy requirement, people want to increase speed and capacity, too.

Roopa was okay with accuracy too, but she wants an environment where she can control the simulations more easily. She wants to run tens of thousands of simulations and find the outliers, and she wants the simulation environment to be more intelligent so that she can concentrate her runs on those outliers.
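As a rough illustration of that kind of environment (not any particular Cadence tool), a post-processing step over a large batch of results might flag outliers like this. It assumes the per-run measurements have already been collected into a CSV; the file name, column, and threshold are hypothetical choices:

```python
# Illustrative outlier screen over a large Monte Carlo sweep.
# Assumes each row of results.csv holds one run's measured metric
# (e.g., offset voltage); file name, column index, and the 4-sigma-
# equivalent MAD threshold are hypothetical, not a tool-defined flow.
import numpy as np

metric = np.loadtxt("results.csv", delimiter=",", usecols=0)

median = np.median(metric)
mad = np.median(np.abs(metric - median))  # robust spread estimate
threshold = 4 * 1.4826 * mad              # ~4 sigma for Gaussian data
outliers = np.flatnonzero(np.abs(metric - median) > threshold)

print(f"{len(metric)} runs, {len(outliers)} flagged as outliers")
for run in outliers:
    print(f"  run {run}: metric = {metric[run]:.4g}")
```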

Daniel Payne in the audience asked about speeding up SPICE with specialized hardware. Vinod answered that the hardest part is making the economics work. The reason the economics work in digital is that you take a design that is fairly static and do a lot of verification on it, such as booting a microprocessor. With analog workloads, you are constantly changing the circuits and running a lot of small simulations. We are looking at using GPUs and AI chips, but for now he feels that the economics are not clear.

Atul agreed. The variety of simulations is just too large, and some cells take just seconds to simulate. "I wouldn't make the investment for what is not that critical a problem."

With that, the time was up, and we all headed back to the Las Vegas Convention Center. Via the smart route.
