There were two keynotes during DesignCon, one by Ben Gu, Cadence's VP of the System Analysis business, and the other by Devin Billings of Boston Dynamics, which I will cover later this week. There was actually a third, on the tutorial day before the conference proper, but I didn't attend that day (it was about Quantum Computing). The picture at the start of this post is me with Chiphead (or perhaps he is Mr. Chiphead, or even Dr. Chiphead). It's a somewhat odd name since the blue traces on his face are clearly PCB routing, not IC routing. So he should be PCBhead.
Ben's biggest challenge with the keynote was getting to San Jose from Texas, where he lives. You might know that there was an ice storm in Austin the day before, and many flights, including three of his, were canceled. His keynote was titled The Intelligence to Design Intelligent Machines.
Ben started with some history, going way back. First, there was human muscle power. Then we added human brain power, which enabled us to develop tools, made with stone, then bronze, then iron. Eventually, we added machine muscle power, such as excavators and tractors. Then machine brain power, computers of increasing power. Now we have powerful enough computers to make AI feasible. The basic ideas of AI have not changed all that much in decades, but now we have powerful enough CPUs and GPUs to make those ideas feasible in practice.
Ben moved on to AI inside Intelligent System Design. The fear: AI will make us all obsolete. The opportunity: it will make us all more productive. The basic expectation: the technology will work and we will get better PPA, better verification closure, faster design cycles, and so on. There was a panel session later in the day on the topic of whether AI will replace engineers. I will cover that in its own post.
EDA has been a driver of designer productivity for decades. Back when the planar integrated circuit was first invented, manual design really meant that. Designs would be sketched on paper, making masks involved cutting rubylith by hand, and the whole manufacturing process was more like an artisanally created piece of pottery. Then we got transistor-level design, polygon-pushing layout editors like Calma GDSII, and SPICE (or SPICE-like) circuit simulation. That gave us roughly an order of magnitude increase in productivity. The next order of magnitude came with cell-based design, a flow based around synthesis and automated place and route. I think everyone expected we would move to higher levels of abstraction in the languages we used to describe designs, but that never happened. Instead, the next level of productivity came from reusable blocks, called design reuse when it was within a company, or IP when one company created the reusable blocks and another deployed them. The next order of magnitude in productivity is going to come from AI-based EDA.
He went on to talk about various areas where Cadence has experience with AI-based EDA, although, keeping to the rules of DesignCon keynotes, he didn't mention any product names. Above is one example, debugging SoC verification failures. Cadence Verisium is focused on this area, which involves prioritizing the most important failures, tracking down each bug, finding its root cause, and then fixing the issue (which, surprisingly, is often the easiest step once the problem is well understood).
Another example is the digital full flow. Cadence Cerebrus, the product that must not be mentioned, not only produces results faster but produces better results. The previous methodology took many engineers many months. With Cadence Cerebrus, one engineer could get a 20% PPA gain in just ten days. Note the time-saving. Even if there were no PPA improvement, going from several months to ten days would be game-changing. Even if it still took the same many months, a 20% PPA improvement would be game-changing. Both at the same time is whatever is above game-changing...match-changing?
Here are a few more examples. Note that the productivity improvements are not 5%, they are 5X. Some of the power reductions are quite extraordinary. A 28.5% reduction in leakage on a chip that goes in a battery-powered device means an enormous increase in standby time (when your phone, say, is not being used but is on, listening for calls or text messages).
Ben had several more examples from other parts of the design process. Here's one more, from a very different area, namely PCB routing. Again there are huge improvements in both productivity (50 hours to 20 minutes is a 150X improvement) and quality: the wire length of the PCB traces was reduced by 16%.
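If you want to sanity-check that speedup figure, the arithmetic is simple (this is just my own back-of-the-envelope calculation, not anything from the keynote):

```python
# Reported PCB-routing runtimes: 50 hours before, 20 minutes after.
before_minutes = 50 * 60   # 50 hours expressed in minutes
after_minutes = 20

speedup = before_minutes / after_minutes
print(f"{speedup:.0f}X")   # prints 150X, matching the figure above
```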
A key technology that underlies all these examples, and more that Ben talked about (and still more that he didn't have time for and cut from his slide deck), is the data platform: the Cadence Joint Enterprise Data and AI (JedAI) Platform. One message of a fundamental AI platform like this is that the best results come from tools that work seamlessly together, all trained on the same data.
The final message: AI is changing everything, producing large gains in both productivity and the quality of results. But it is also the case that this is only the beginning. The start of the modern era of AI has a very explicit date, namely the ImageNet Large Scale Visual Recognition Challenge 2012. This was the event where the neural network approach raced past the algorithmic approaches to image recognition. It is referred to as "the ImageNet moment". However, that was just a little over a decade ago. Since then, the approach to EDA algorithms (and many other domains outside the scope of Ben's keynote or this post) has been completely upended.