This year's Jasper User Group meeting took place last week. As usual, the meeting was opened by Ziyad Hanna's presentation on the state of formal verification. This year, he titled his presentation Formal Verification in the Era of Domain-Specific Computing. The clearest exposition of the transition from the old era of general-purpose computing to the new era of domain-specific computing is Hennessy and Patterson's Turing Award lecture.
Let's start by looking at the era before the era of domain-specific computing. This was, not coincidentally, what has been called the era of "happy scaling" in the underlying semiconductor technology. The combination of Moore's Law and Dennard scaling, along with architectural innovations enabled by huge numbers of almost-free transistors (branch prediction, instruction-level parallelism), meant that for a decade we had a 1.5X increase in CPU performance per year. Everyone learned that it was silly to build a special-purpose architecture, since it was expensive and would soon be overtaken by Intel's x86 line.
This was also the decade of early industrial formal verification (1990-2000), driven by corporate CAD groups in leading chip design companies. But there was a lack of methodology and a lack of capacity. This was the era when you needed a PhD in formal verification to do any formal verification successfully.
But the golden era came to an end. Dennard scaling ended and, as a result, power density increased to the point that we were looking at the temperatures of a rocket nozzle. The computer architects also ran out of new ideas, so growth in hardware-managed parallelism slowed and we needed to switch to software-controlled parallelism, also known as multicore.
In this era, formal became industrial strength, driven by startups such as Jasper (which, of course, Cadence acquired in 2014). Formal crossed the chasm from where only experts could use it to where, with the support of experts, hundreds of users became "formal engineers".
Epoch 3, which we are still living in, is the democratization of formal verification, going from hundreds of users to tens of thousands of users, driven by EDA companies like Cadence.
As a result of all these trends, formal usage has exploded just in the last two years, driven by processor verification, and especially by domain-specific architecture verification. Jasper sessions are growing at 43% CAGR!
If we look at the type of problem that is best suited to the AI/ML approach, it is hard problems that are both data-intensive and labor-intensive. Pretty much everything in EDA is like that, and formal certainly is.
This leads to a generic EDA approach: training on earlier designs, reinforcing with current designs, and using inference to optimize how the design process is automated.
In the specific case of model-checking orchestration, it looks like the diagram above.
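To make the idea of model-checking orchestration concrete, here is a minimal sketch of what "learn from earlier designs, then use inference to pick an engine" could look like. Everything here is an assumption for illustration: the engine names, the design features, and the nearest-neighbor selection are all hypothetical stand-ins for whatever Jasper actually uses internally.

```python
import math

# Hypothetical history of earlier proofs: (design features, engine that
# proved the property fastest). Features here are (state bits, property
# depth) -- purely illustrative, not real Jasper telemetry.
HISTORY = [
    ((128, 10), "bmc"),
    ((4096, 5), "abstraction"),
    ((256, 50), "k-induction"),
    ((8192, 8), "abstraction"),
    ((64, 40), "k-induction"),
]

def pick_engine(features):
    """Infer an engine for a new proof from the nearest historical design."""
    _, engine = min(HISTORY, key=lambda rec: math.dist(rec[0], features))
    return engine

def record_result(features, engine):
    """Reinforce with the current design by folding its outcome back in."""
    HISTORY.append((features, engine))
```

A real orchestrator would rank many engines, run several in parallel, and use a trained model rather than nearest neighbor, but the train/reinforce/infer loop is the same shape.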
This is actually a small subset of what Ziyad covered. But this post is already getting long. So let's cut to the bullets from his final slide: