At the no-longer-so-recent Jasper User Group (JUG) last year, the keynote was by Erik Seligman. He is one of three authors, along with Tom Schubert and the very nautically named MV Achutha Kiran Kumar (MV stands for motor vessel and is used in front of civilian vessels such as cruise ships), of a book on formal verification. The title is Formal Verification: An Essential Toolkit for Modern VLSI Design. This book is different from any other book I know of on formal verification, since it is not an academic treatise on formal verification algorithms aimed at people who want to implement or improve them. Instead, it is aimed at people who want to use formal verification tools to build chips.
As I wrote about his keynote:
When Erik first got involved with formal verification (FV), he looked for books, but there were only academic books aimed at people writing formal verification tools, not at people trying to use formal verification in the real world. He thought back to bugs that nearly escaped, and to the common errors that people make. One of the most common errors, made by newbies and experts alike, is vacuity: setting impossible constraints that basically mean that any circuit will verify correctly. But there are lots of tricks, rules of thumb, and techniques that people who have been using FV for years have internalized. Since the book he wanted didn't exist, he decided to write it. He wanted something with more of a "hacker mentality," targeted at practicing verification engineers and their management.
Formal verification has been around for a long time. We had a formal verification product, Vformal, at Compass Design Automation 25 years ago. But it really came to prominence with the Pentium FDIV bug (in which certain floating-point divides could give the wrong answer under rare circumstances, and which cost Intel $475 million to replace the processors). There was suddenly a realization that simulation was not enough and that proving correctness, which is what formal verification does, was essential. When Intel's management basically said "never let this happen again," they truly began to use formal verification in earnest. Simulation is what Sifuei Ku of Microsemi referred to as the British Museum Algorithm: "So, essentially you run fast simulations and then you drill down to specific corners that give you issues. ...You walk everywhere. And if you don't walk to the right place, you miss something."
The challenge with using formal verification has been twofold. First, it used to be so hard to use that only formal verification specialists with PhDs in the subject could understand how to use it. Second, the algorithms (and the computers of the day) lacked the power to handle real-world problems. Even small multipliers could defeat early implementations.
I highly recommend the chapter on formal verification's greatest bloopers. You may not need a PhD to run formal verification anymore, but you are still not going to just take your SoC, throw it at the tool, and have it say yes or no. You have to do manual work to address the complexity (although formal can be used out of the box for some simple things, too). The big problem is that in telling the tool things to make verification tractable (such as blackboxing parts of the design), you risk telling it something that isn't true. The worst case is vacuity: if, for example, you define a constraint that is universally false, you rule out all possible inputs, and so every property is trivially true. The chapter is full of real-world examples that, I'm sure, were learned from a lot of painful experience.
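To see why vacuity is so insidious, here is a toy exhaustive checker (a hypothetical illustration in Python, not any real formal tool): a property is checked over every input state allowed by the constraints, so a contradictory constraint leaves zero states to check and every property, even an absurd one, "proves."

```python
from itertools import product

def check(constraint, prop, n_bits=2):
    """Exhaustively check prop over every n_bits input state allowed by constraint.

    Returns (holds, n_states). If constraint admits no states at all,
    all() over the empty set is True -- a vacuous "proof".
    """
    states = [bits for bits in product([0, 1], repeat=n_bits) if constraint(bits)]
    return all(prop(s) for s in states), len(states)

# A sensible constraint: inputs a and b are never asserted together.
ok, n = check(lambda s: not (s[0] and s[1]), lambda s: s[0] + s[1] <= 1)
print(ok, n)  # True over 3 states: a genuine proof

# A contradictory constraint: a must be both 1 and 0 at once.
# No state satisfies it, so even an always-false property "passes".
ok, n = check(lambda s: s[0] and not s[0], lambda s: False)
print(ok, n)  # True over 0 states: vacuous
```

Real formal tools report coverage or vacuity warnings for exactly this reason; the state count in the sketch plays that role, and a "proof" over zero states should set off alarm bells.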
Mentor's CEO Wally Rhines once told me that my book, EDA Graffiti, was the best book on EDA. But I pointed out that as far as I know, it is the only book on EDA. In the same way, this is the best book for the formal verification practitioner. And it seems to be very good.
You can buy the book on Amazon (and yes, there's a Kindle edition).