Andreas Kuehlmann is a Cadence fellow, director of Cadence Research Laboratories in Berkeley, and also an adjunct professor at the University of California at Berkeley. In this interview, he talks about current research activities and possible future directions for the EDA industry.
Q: Andreas, what does your work at Cadence involve?
A: My role is to lead the research we do in the labs and to collaborate with [Cadence] business units, as well as to interact with the external research community. Most of my work is related to product groups. I help connect them to research, from actual coding to product strategy, all the way up to engaging with customers.
I’m also an adjunct professor at U.C. Berkeley, where I teach a course every spring on logic synthesis and verification. I advise three graduate students, and I spend time working with them in the lab.
Q: What’s the role of Cadence Research Labs within the context of Cadence’s overall R&D efforts?
A: We help product groups get products out the door across different time horizons. We currently focus a lot on shorter-term items that get into the next release or have an impact in the near future. But we also keep an eye on longer-term opportunities and trends.
We have a small, compact organization that really works across the board. This gives us the opportunity to look at synergies between technologies that you could not do in larger product organizations. We have many examples where we could adapt ideas from one technology area to a totally different one, including reusing source code across product lines. It is critical that we do not actually own products, but rather support the R&D organization as a technology provider.
Q: What topics are you looking at in the Research Labs today?
A: Currently we are mostly working on immediate needs. In the front end, many of the Cadence formal verification engines have been developed in the labs. We’re now looking at more advanced engines that raise the abstraction to word-level verification, not just individual bits. We are working on constraint solving technology for simulation, and we’re collaborating with the Palladium team on next-generation compiler technology.
As an example of developing new product ideas, the Cadence C-to-Silicon Compiler actually came out of the labs. In 2005 we moved this into incubation, and last year it hit the road as a real product. We’re still very much involved with helping it to ramp up. We also work closely with the RTL Compiler and Conformal teams. Right now we’re looking into physically aware low-power sequential optimization, from clock gating all the way to retiming, and addressing the corresponding challenges with equivalence checking.
Q: What work is ongoing in system-level design?
A: We have multiple things going on. We’re looking at transaction-level modeling and what can be done from a synthesis and verification point of view. It’s all about raising the level of abstraction and making it real in practical flows.
Q: Are you looking at embedded software development and verification?
A: We’ve done quite a bit of research on using formal methods for embedded software verification. I think it’s an important area and a huge opportunity. EDA companies, especially Cadence, are in a perfect position to address [embedded software], especially low-level software, where the needs are very close to hardware.
Q: What are the key research opportunities in functional verification?
A: Number one is getting enough coverage in functional verification. Simulation models will hopefully rise in abstraction from RTL to TLM or higher. Number two is how to do equivalence checking. If we raise the level of abstraction, how do we make sure things are equivalent so we do not have to repeat simulation at a lower level? A third point is that embedded software plays an increasing role. Verifying the consistency and correctness of the many hardware and software configurations will become increasingly important. Finally, formal methods have always been complementary to traditional verification methods, and we need to continue boosting their capacity and raising their abstraction level.
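At its core, the equivalence checking he mentions asks whether two versions of a design compute the same function. The sketch below is only a toy illustration of that idea, exhaustively comparing a hypothetical reference function against an algebraically rewritten version; production equivalence checkers rely on SAT and BDD engines rather than enumeration, and both function names here are invented for the example:

```python
from itertools import product

# Hypothetical "reference" and "optimized" versions of the same
# 3-input combinational function (the rewrite uses distributivity).
def reference(a, b, c):
    return (a and b) or (a and c)

def optimized(a, b, c):
    return a and (b or c)

def equivalent(f, g, n_inputs):
    """Exhaustively compare f and g on every input combination.

    Feasible only for tiny input counts; real tools prove
    equivalence symbolically instead of enumerating 2**n cases.
    """
    return all(f(*bits) == g(*bits)
               for bits in product([False, True], repeat=n_inputs))

print(equivalent(reference, optimized, 3))  # True: the rewrite is sound
```

Sequential optimizations such as retiming are harder to check because they change state encodings, not just combinational logic, which is exactly the challenge he points to.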
Q: What other challenges do you see on the horizon for EDA?
A: One big challenge is the shift in computing platforms from single-core performance improvements to many cores. Until a few years ago we got a free ride from frequency scaling. Now we will see an exponential increase in the number of cores, whether homogeneous or heterogeneous. This is a tremendous challenge for the EDA industry. Not only do we have to parallelize software, adopting multi-threading or using distributed computation, but some of the traditional approaches we’ve taken may change. We will have to explore other paradigms more suitable to parallel computation.
We need scalable solutions. It doesn’t solve the problem for long if we can only scale to two or four cores. The real challenge is 16, 32 or more cores.
Q: What’s hot in academia with respect to EDA?
A: Traditional EDA is not really “hot” anymore in the academic world, not like areas such as biotech. But that doesn’t mean there’s no interest. The question of what to do with 1,000 cores on a chip is a very interesting topic. For example, the Par Lab [Parallel Computing Laboratory] at U.C. Berkeley has a cross-domain focus. What does a parallel platform bring us? What are the programming challenges and how can we address them? Can we influence hardware development? People are looking for a compromise between programmability and parallelism. I think the jury is still out on where exactly we’re going.
Q: Is the technology transfer between academia and industry working in EDA?
A: Actually, we have a very close tie between academic research and industrial applications. A lot of research is driven by the needs of the industry. In this respect EDA is better than many other research areas. I think it has to do with the fact that our industry has many roots in academic research and has always stayed closely involved. In addition to big EDA companies like Cadence, companies like IBM and Intel have been very involved and have actively driven directions for research.
Q: Can EDA technology or techniques be extended to areas outside electronic design?
A: Absolutely. I think the EDA industry is one of the most advanced areas in computer science, in terms of pushing computationally hard algorithms onto very large-scale problems. Last year we had a student project on using some of our SAT checking technology on protein folding. That’s something the biologists haven’t come up with yet. It’s just a single example of the opportunities I see.
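The SAT technology he refers to decides whether a Boolean formula has a satisfying assignment; that same machinery can encode problems far from chip design, such as the protein-folding project he mentions. The following is only a minimal brute-force sketch of the SAT question itself (the formula and encoding are invented for illustration; industrial solvers use far more sophisticated algorithms like CDCL):

```python
from itertools import product

def is_satisfiable(cnf):
    """Brute-force SAT check over a formula in CNF.

    Each clause is a list of non-zero integers: v means variable v
    is true, -v means variable v is false (DIMACS-style literals).
    Tries every assignment, so it only works for tiny formulas.
    """
    variables = sorted({abs(lit) for clause in cnf for lit in clause})
    for values in product([False, True], repeat=len(variables)):
        assignment = dict(zip(variables, values))
        # A clause is satisfied if any one of its literals is true.
        if all(any(assignment[abs(lit)] == (lit > 0) for lit in clause)
               for clause in cnf):
            return True
    return False

# Hypothetical example: (x1 OR x2) AND (NOT x1 OR x3) AND (NOT x2 OR NOT x3)
formula = [[1, 2], [-1, 3], [-2, -3]]
print(is_satisfiable(formula))  # True, e.g. x1=True, x2=False, x3=True
```

The point of such encodings is exactly the one he makes: once a domain problem is phrased as constraints, decades of EDA solver engineering apply to it directly.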