Dr. Ziyad Hanna, a Cadence VP of R&D in formal and automated verification, was recently appointed a visiting professor at the Department of Computer Science at the University of Oxford, England. Ziyad earned his Ph.D. from Oxford, researching “Abstract Modeling and Formal Verification of Computer Microarchitecture.” During his three-year term there, Ziyad will participate in research with the university’s Automated Verification Group, one of the largest such academic research groups in the field worldwide.
For the design engineering community, this appointment, which came together in part from the efforts of the Cadence Academic Network, is a great opportunity for joint research that can help advance formal verification. I chatted with Ziyad recently about his extensive work so far in formal. Listen in.
What are some lessons and best practices in the area of formal verification that you plan to share with your students?
Formal verification is now one of the fastest-growing segments in the electronic and system design industry. In the past, formal was confined to academia and to small applications, mainly verifying the correctness of “toy” designs. Now it’s becoming mainstream: formal is used to address complex designs in modern SoCs and platforms. For example, our JasperGold platform covers architectural design, RTL, low power, security, functional safety, and system integration and verification. Our engineers have been able to pinpoint issues in the design flow and find ways to apply formal technology to address them.
In the past, we’d try to create a generic solution that solves a problem, but we could not cope with the complexity of these problems. Since then, at Jasper and now at Cadence, we’ve created custom solutions for specific problems, our JasperGold Apps, covering areas including x-propagation, low-power verification, property verification, and many more. By segmenting and customizing our solutions, we’ve made them easy to use, effective, and scalable. And there are many more potential apps to be created in pre- and post-validation; in design implementation all the way from architecture specification down to RTL; and in gate-level implementation and layout, including functional aspects of timing, place and route, and noise analysis.
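The x-propagation problem mentioned above can be sketched with three-valued logic. This is a hypothetical Python illustration of the principle, not JasperGold code: an unknown value X propagates through a gate unless a controlling input masks it.

```python
# Illustrative sketch of x-propagation using three-valued logic
# (assumed example, not Cadence/JasperGold code).
ZERO, ONE, X = "0", "1", "X"

def t_and(a, b):
    # AND is 0 whenever either input is 0, even if the other is unknown.
    if a == ZERO or b == ZERO:
        return ZERO
    if a == ONE and b == ONE:
        return ONE
    return X

def t_or(a, b):
    # OR is 1 whenever either input is 1, even if the other is unknown.
    if a == ONE or b == ONE:
        return ONE
    if a == ZERO and b == ZERO:
        return ZERO
    return X

def t_not(a):
    # NOT of an unknown stays unknown.
    if a == X:
        return X
    return ONE if a == ZERO else ZERO

# A controlling value masks the X...
print(t_and(ZERO, X))  # prints 0 -- the unknown does not propagate
# ...otherwise the X propagates to the output:
print(t_or(ZERO, X))   # prints X -- the output is unknown
```

A formal x-propagation app asks this question exhaustively: for every reachable state, can an X at an uninitialized register reach an output that should be deterministic?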
How do you hope this experience will impact your work at Cadence?
Oxford is known as a place that focuses on theory, but in the last few years, we’ve seen more emphasis on practical applications. Formal is in between theory and practice. Working with Oxford, I believe we can open new channels for research and innovation. Researchers from Oxford and Cadence will be able to invent new technologies to advance our solutions to customers.
For Cadence, this opportunity will require long-term strategic thinking about problems that will emerge over the next three to five years and how we can solve them. Recently, we have started to address emerging needs in the automotive and embedded design space, applying formal to tricky RTL design areas where it can boost existing methods, such as clock-domain crossing, functional RTL analysis, and more. We at Cadence are focusing on integrating our JasperGold technologies with other design and verification solutions at Cadence, including simulation and emulation, co-development of common coverage metrics, and signoff. Formal technology can be the “glue” that boosts existing methods and makes them much better for our users. At the same time, formal technology can leap ahead by leveraging the other validation and design technologies. The interoperability between formal and non-formal technologies can create new blends and hybrid technologies that advance state-of-the-art solutions.
What are some key areas of formal verification research that you are currently involved with now?
Scalability is our major concern in this domain. It’s easy to get to a very large number of states in a design that the tool needs to visit and verify, a problem we call “state explosion.” You can imagine that even in a small block of the design, like an arbiter, there are billions of possible states to verify, a number bigger than the total number of pages on the internet, or larger than the number of polygons in modern SoC designs. Inventing methods that cope with that large state space, through reachability analysis, modeling, and abstraction, is hard-core research; almost every university has research around formal-methods scalability, and at Cadence we have core technologists working on this problem. With the great assistance and dedication of my team over many years, we have made great progress in advancing the scalability of formal technologies.
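The state explosion Ziyad describes is easy to see with a back-of-the-envelope sketch (an assumed illustration, not a JasperGold algorithm): a block with n state bits has up to 2**n states, so explicit enumeration stops being feasible very quickly.

```python
from itertools import product

# Toy illustration of state explosion: a design block with n state bits
# has up to 2**n states the tool may need to visit.
def num_states(n_bits):
    return 2 ** n_bits

# Explicit enumeration is fine for a tiny block...
tiny = list(product([0, 1], repeat=4))
print(len(tiny))        # prints 16

# ...but a modest 64-bit state vector already yields ~1.8e19 states,
# far more than the number of pages on the internet.
print(num_states(64))   # prints 18446744073709551616
```

This is why research focuses on symbolic representations and abstraction rather than visiting states one by one.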
Moore’s Law talks about the number of transistors doubling every couple of years. Complexity is scaling even faster. More transistors yield more capabilities and features in silicon, and each feature has to be verified both by itself and against the other existing features, creating exponential growth in functional complexity. To cope with this capacity and design-size challenge, formal technology needs to scale up. There’s more burden on core algorithms, but it is an exciting domain and we are still scratching the surface.
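The claim that complexity outpaces transistor count can be made concrete with a small sketch (illustrative numbers, not Cadence data): if each feature must be checked on its own and against every existing feature, the work grows quadratically, and checking all feature combinations would grow exponentially.

```python
from math import comb  # available in Python 3.8+

# Back-of-the-envelope model: verifying n features means n standalone
# checks plus one check per pair of interacting features.
def verification_tasks(n_features):
    return n_features + comb(n_features, 2)

for n in (8, 16, 32, 64):
    print(n, verification_tasks(n))
# Doubling the feature count roughly quadruples the pairwise work;
# covering all feature *subsets* would grow as 2**n.
```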
On the application side, there are many areas still to be addressed: software verification, low-power design, verification for IoT devices, and verification of mission-critical applications such as automotive and aerospace.
Users need productivity and ease of use, without being bogged down by the complexities underneath the tool. We’re working to make our systems smart enough to guide the user through the steps to the end result. Machine learning and Big Data analysis using formal methods form a new line of research to advance the productivity of large-scale software like what Cadence delivers in the EDA space.
During my research at Oxford, I focused on symbolic execution and high-level modeling technologies aimed at boosting formal verification technologies. I believe this line of research should be continued, together with other software verification technologies.
What drew your interest to formal verification?
Ever since I was an undergraduate student at Tel-Aviv University many years ago, I have been drawn to practical applications of mathematics and theoretical computer science. I always had an interest in finding applications for computer science algorithms in programming, gaming, and verification.
Formal specification and verification was a rising field then, and it gained major attention when the infamous Intel Pentium bug was found in the mid-90s. Formal verification technology, even at that early stage, was able to identify the logic bug, and researchers at Intel and CMU were then able to verify the Pentium design fix. I was at Intel at the time, and formal verification started to get major attention at all management levels. I decided to build my career path around formal verification, spanning both academic and industrial experience.
During the last 25 years, I have invested in research and development of tools and solutions that promote the applications of formal technologies in industry. I’ve co-authored dozens of papers, mentored many research projects, filed many patents, given dozens of invited talks, and pursued collaboration opportunities in academia worldwide. At Jasper, I chaired the technology advisory board, which included top-notch researchers in formal technologies from top universities worldwide. Together with my colleagues in Cadence’s Formal and Automated Verification Group, we won the best innovation award of 2015.
I’m inspired by modeling and verification, solving hard problems for our customers. Every day I’m fascinated by new apps that this technology can deliver. It’s amazing.
What are some trends that might emerge over the next few years regarding formal verification?
In the last few years, formal has been like an isolated island. Interoperability with simulation and emulation, for instance, has been very limited. We need to be able to move between methods interchangeably, switching from one to another and sharing results across different validation methods, to create a holistic approach.
Abstraction is another area. To cope with design complexities, we need to verify models or systems at a higher level of abstraction. At Cadence, we have high-level synthesis with our Stratus High-Level Synthesis solution, which bridges high-level design and RTL. Applying formal and high-level synthesis and verifying high-level models is another trend of research.
Scalability of formal verification methods will continue to be the core of basic research in formal verification and drive the innovation on reachability analysis, abstraction, and bug hunting technologies. On the application side, software verification for embedded systems and end user applications will continue to attract the research community. Besides that, there are several system-level concerns where formal verification will strive to play a major role. This includes reliability, safety, and security of embedded systems such as in the automotive and aerospace domains.