Paul McLellan

DVCon Keynote: the Past and Future of Verification

10 Mar 2016 • 5 minute read

Last week was DVCon, the design and verification conference. Despite the D standing for design, DVCon is really the main conference focused on verification. The keynote, Verification: Past, Present, and Future, was given by Wally Rhines, the CEO of Mentor.

One thing he pointed out was that the number of verification engineers has been growing at 3.5 times the rate of design engineers, at nearly 13% per year. By the rule of 70, that means the number of verification engineers is doubling roughly every five years.
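For anyone who wants to check that arithmetic, here is a minimal sketch of the rule-of-70 estimate alongside the exact compound-growth calculation. The 13% annual growth rate is the only figure taken from the talk; the function names are purely illustrative.

```python
# Quick check of the "rule of 70" figure quoted in the keynote.
# The ~13% annual growth rate is the only number taken from the talk;
# everything else here is illustrative.
import math

def doubling_time_rule_of_70(annual_growth_pct):
    """Approximate doubling time in years via the rule of 70."""
    return 70.0 / annual_growth_pct

def doubling_time_exact(annual_growth_pct):
    """Exact doubling time for compound annual growth at the given rate."""
    return math.log(2) / math.log(1 + annual_growth_pct / 100.0)

growth = 13.0  # percent per year, as quoted in the keynote
print(f"Rule of 70: {doubling_time_rule_of_70(growth):.1f} years")  # ~5.4
print(f"Exact:      {doubling_time_exact(growth):.1f} years")       # ~5.7
```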

This has driven the growth of verification, in the broadest sense, to a $1B+ industry. The underlying trend is that verification complexity is increasing at 3-4X the rate of design complexity, hence all those verification engineers and all those verification tools.

He began with a potted history of verification through the ages, starting from the "rubylith" era when there was no verification at all. In fact, as Wally pointed out, there was not really anything to verify against, since there were no real specifications. The datasheet described what the chip actually did, not what it was meant to do.

The next era, in the early 1970s, was CANCER and SPICE, circuit simulators whose descendants are still in use today for analog design and cell characterization. But as the number of transistors on a chip increased, the level of simulation moved up to the gate level. Chips as large as 100K gates could now be simulated. This was what Wally called Verification 0.0.

Verification 1.0 was the era of Verilog and VHDL. The more significant of the two was Verilog, originally created by Gateway Design Automation, which Cadence acquired back in 1989. Verilog was itself a very high-performance gate-level simulator, but the real game changer was the Verilog language, which took verification, and later synthesis, up to the RTL level. One thing that is a little lost in history is just how much Verilog simulation performance has improved. It is not just that computers have got faster, although of course they have. Since 2000, RTL simulation performance has improved by over a factor of 10.

Verification 2.0 was the next era, with an increasing focus on testbenches and methodology. There were several new languages, such as Verisity's e (Verisity was acquired by Cadence in 2005). But there was a strong desire in the user base for a single standard, and eventually everything coalesced around SystemVerilog. In the graph Wally showed, it is easy to see how SystemVerilog use has exploded while the other solutions have largely fallen by the wayside.

Verification 3.0 is the system era in which we find ourselves today. It is no longer good enough to do functional verification of RTL alone. Power has become as important as, if not more important than, performance. There is a huge software load in most SoCs, and the hardware needs to be verified in the context of that software. But even booting a modern operating system like Linux or Android takes far too many cycles to be done with vanilla simulation. So a whole range of verification methodologies has emerged:

  • Virtual prototypes
  • Formal techniques
  • Simulation still, of course
  • Emulation
  • FPGA prototyping

One of the biggest changes over the last decade has been the growth of emulation from a niche market to the mainstream. This is partly because compilation technology has improved a lot, so it no longer takes several months to bring a design up in emulation. It is also because processor clock speeds have stalled, which means increased microprocessor performance comes from more cores, and simulation has difficulty taking advantage of lots of cores because of the need to coordinate simulated time. In addition, emulation has moved from the lab to the data center, with virtualized devices, making it accessible to a large number of engineers simultaneously.

Moving forward, the goal is for all these verification methodologies to use the same stimulus, whether it runs in simulation, emulation, or FPGA prototyping. As Bill Hodges of Intel said, "Users should not be able to tell whether their job was executed on a simulator, emulator, or prototype." Accellera's portable stimulus working group has been hard at work to make this possible across the entire industry.

Looking to the future, security will become a major area requiring verification. The most obvious level is making sure that systems are immune to attacks. This requires both hardware and software, in particular storing cryptographic keys in hardware and verifying that no paths exist that can access the keys in an uncontrolled manner.

Another area of increasing concern is counterfeit chips. Large numbers of overproduced or recycled chips end up in use. This cannot be fixed by verification alone, of course, but circuitry such as on-chip odometers may be required to detect recycled chips, or chips may require a key to enable them the first time they are used.

A third security area is malicious logic inside chips. Wally said that when he talks to security professionals in the US and asks them about this, he basically gets laughed at: of course they are doing it, and they assume everyone else is, too. So we will need to move verification from purely verifying that a chip does what it is supposed to do to also verifying that it doesn't do anything else, a potentially harder task.


Another area of future growth in verification is the increasing reach of functional safety standards such as ISO 26262. As more and more chips go into automotive and aerospace applications, they need to be shown to avoid systematic failures and to behave safely even in the presence of random hardware faults. This requires full requirements tracking, not just to ensure normal operation, but also to assess the risks of faults and to verify that the design remains safe when they occur.

My conclusion? There is plenty of growth left in verification, as new areas become important and as the challenge of verification grows faster than almost any other aspect of the market.