
Breakfast Bytes Blogs

Paul McLellan
23 Jun 2020

vManager: One Manager to Rule Them All

Here's a high-level view of verification:

If everyone properly plans their verification project, why do quality problems and schedule slips persist? It really comes down to the adage “Begin with the end in mind.” A good plan contains detailed goals using measurable metrics, along with optimal resource usage and realistic schedule estimates.

That could have been written this morning, but I'm actually quoting from a Cadence piece in EETimes...in 2005. As Alphonse Karr said in Les Guêpes, "plus ça change, plus c'est la même chose", usually translated as "the more things change, the more they stay the same".

But actually, a lot has changed in verification. In 2005, we didn't have the portable stimulus standard (PSS). Formal verification was in its infancy. If you wanted to use emulation, you could expect to need months just to bring up the design. If you wanted to use FPGA prototyping, then you had to start by purchasing some FPGAs. RTL simulation was then, as now, the workhorse of verification. In fact, the job of a verification engineer mostly consisted of manually kicking off simulations and then analyzing the results. But a better way was already emerging. Back in 2005, Cadence had just acquired Verisity, and with it an early version of the vManager tool. But it was a tool for a single user.

One thing that truly hasn't changed since 2005, or 1985 for that matter, is that designs continue to get larger. Just this morning as I write this, I watched a video where Jensen Huang, NVIDIA's CEO, announced their latest AI chip from his kitchen. It has 54 billion transistors. Larger chips don't just mean that the verification engines need to constantly improve to handle larger designs, they mean that the tools that orchestrate the whole verification process need to constantly improve, too. While originally for a single user, today Cadence's vManager Verification Manager is a tool for a whole global enterprise.

Deploying Multiple Engines

A good analogy is a delivery company. Obviously they need vans and planes. But it's not really about the quality of the vehicles, it is about how they are deployed. A plane is not better than a van, it is different and complementary. In the same way, formal is not better than emulation or simulation, it is different and complementary. The vManager platform is the heart of how these engines get deployed, the heart of the verification flow. How you drive the engines is as important as the engines themselves. In some sense, it is the fourth engine.

I'm going to take the verification engines as a given. If you want more details, the latest posts that I wrote on them are:

  • Xcelium: Parallel Simulation for the Next Decade on SystemVerilog simulation
  • Intel and PSS...and Simics, a Blast from My Past on PSS
  • The Dynamic Duo on Palladium Z1 emulation and Protium X1 FPGA prototyping
  • The 2019 Jasper User Group and Machine Learning in JasperGold on formal verification

vManager

As I said above, the vManager platform was originally created to support a single team at a single site. That doesn't really describe how most companies operate today. There are many projects at different stages of design running in parallel. Design groups are geographically dispersed. A lot of verification is run on large server farms or in the cloud. In the 15 years since the vManager platform was a tool for a single user, it has gone through several changes, but I'm simply going to skip to where it is today.

Today, the high-availability version of the vManager platform works at the enterprise level, distributed across multiple projects, as in the diagram above.

Further, it works across geographically dispersed sites, handling all the usual issues with distribution such as failures of servers or networks, keeping the important data synchronized but also keeping a lot of locality to maintain high performance. This includes pushing out to manage jobs in the cloud. These aspects make the vManager platform truly an enterprise-class solution.

There is a lot more to running a job than just firing up a copy of Xcelium simulation:

  • Prepare regression
  • Request farm resources
  • Initialize hosts
  • Load data
  • Actually run the jobs
  • Save data
  • Generate reports
  • Failure triage
  • Design/fix tests

The vManager platform automates all of this. There are some non-trivial aspects hidden behind some of those bullets. For example, "request farm resources" consumes over one-third of verification time with jobs waiting for resources to be available. Worse, the jobs that run for the longest times also have the longest delays (because they tend to need more resources and so they are in shorter supply). Costs and runtimes vary from server to server, too, so optimizing the allocation of a cohort of verification jobs onto a complex verification environment is more complex than it might seem.
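To see why the "request farm resources" step is worth optimizing, here is a minimal sketch of the classic longest-processing-time-first scheduling heuristic for spreading a cohort of regression jobs across a farm. This is an illustration only: vManager's actual allocation algorithm is not public, and the job names and runtimes below are hypothetical.

```python
import heapq

def allocate_jobs(job_runtimes, num_servers):
    """Greedy longest-processing-time-first scheduling: assign each job
    to the server that currently finishes earliest. A classic heuristic
    for reducing overall regression turnaround time (makespan)."""
    # Min-heap of (current finish time, server id)
    servers = [(0.0, s) for s in range(num_servers)]
    heapq.heapify(servers)
    assignment = {}
    # Longest jobs first, so they are not stranded at the end of the run
    for job, runtime in sorted(job_runtimes.items(), key=lambda kv: -kv[1]):
        finish, sid = heapq.heappop(servers)
        assignment[job] = sid
        heapq.heappush(servers, (finish + runtime, sid))
    makespan = max(t for t, _ in servers)
    return assignment, makespan

# Hypothetical regression jobs with estimated runtimes in hours
jobs = {"uvm_smoke": 2.0, "full_regression": 9.0,
        "gls_subset": 5.0, "power_aware": 4.0}
assignment, makespan = allocate_jobs(jobs, num_servers=2)
```

Even this toy version shows the point made above: once servers differ in cost and speed, and jobs differ in resource needs, picking the right placement for each job becomes a real optimization problem rather than simple queueing.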

Verification Productivity

The heart of the vManager platform is the verification plan. The reason for this is what was in the opening quote from 2005: "a good plan contains detailed goals using measurable metrics". Back in that era, the vManager platform was a tool for a single user, but it was already a lot better than the "competition", which was a bunch of spreadsheets and scripts. Today, with the enterprise-grade version, a verification plan can be run across different regions, farms with different technologies, and with high availability.

Another enterprise aspect of the vManager platform, especially in automotive, aerospace, and defense, is connecting verification to requirements management systems such as DOORS (from IBM), REQTIFY (from Dassault), or Jama. For large projects, these can be used to capture and track requirements across the entire system. The vManager platform can link into these to tie verification of a given piece of functionality back to the requirement that it satisfies.
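Conceptually, that traceability link is a mapping from each requirement to the tests that verify it, with test results rolled up into a per-requirement status. Here is a toy sketch of that idea; the requirement IDs, test names, and status labels are all hypothetical, and this does not use any actual DOORS, REQTIFY, or Jama API.

```python
def requirement_coverage(req_to_tests, test_results):
    """Roll verification results up to requirement level: a requirement
    is 'verified' only when every test linked to it has passed."""
    report = {}
    for req, tests in req_to_tests.items():
        statuses = [test_results.get(t, "not_run") for t in tests]
        if statuses and all(s == "pass" for s in statuses):
            report[req] = "verified"
        elif "fail" in statuses:
            report[req] = "failing"
        else:
            # No linked tests, or some tests not yet run
            report[req] = "incomplete"
    return report

# Hypothetical requirements, each linked to verification-plan tests
links = {"REQ-101": ["uvm_smoke", "power_aware"],
         "REQ-102": ["gls_subset"],
         "REQ-103": []}
results = {"uvm_smoke": "pass", "power_aware": "pass", "gls_subset": "fail"}
report = requirement_coverage(links, results)
```

The point is the direction of the link: a failing test doesn't just flag a broken testbench, it flags a system requirement that is not yet satisfied.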

Summary

As SoC designs become more complex with increasing functionality, verifying those SoCs becomes a very challenging problem. Compounding this issue are the time-to-market pressures that lead to urgency in testing and delivering products in months rather than years, as was previously done. To address the large-scale needs of today's SoC verification effort, a full-featured, centralized, scalable, and flexible solution is necessary. That solution is the vManager platform.

The benefits of this approach are why 18 of the top 20 semiconductor companies use the vManager platform to power their verification plans.


Sign up for Sunday Brunch, the weekly Breakfast Bytes email.

Tags:
  • featured
  • formal
  • Protium
  • Palladium
  • xcelium
  • JasperGold
  • simulation
  • vManager