Paul McLellan

Future of EDA: Industry...Well, Cadence...Weighs In

22 Nov 2016 • 3 minute read

There was a recent panel discussion at Cadence on the future of EDA. If you didn't see it, read yesterday's post first, which introduces everyone and gives the view from academia: The Future of EDA: The View from Academia.

Anirudh Devgan

Anirudh has to deliver product every day, so he brought things down to earth a little: what is happening, and how does it apply to us at Cadence? He talked about a slide he saw recently that had two boxes multiplied together to give a third. The two boxes being multiplied were IoT and networking; the third box, to the right, was the cloud.

In IoT, Cadence is well positioned: we have a lot of experience in analog design. Networking we know, too, especially 5G, which is racing ahead. Cloud is an area where we have done lots of CPU designs, and machine learning is the new part, where Tensilica is well positioned.

He said that when he was at school (CMU) there were no machine learning courses; now it is a whole department. He took a course in convolutional neural networks and said that, at one level, it is just a non-linear optimization problem, which he already knows. His PhD was on circuit simulation. When he mentioned it to his dad (a mathematics professor at IIT Delhi), his dad said it was just first-order ordinary differential equations. There is a sense in which that is true, but in EDA we have to solve 10M or 100M of them at once, so it is no longer pure math; the challenge is how to apply it at a very large scale. Cadence has done some good work with our "-US" products using 4, 8, or 16 CPUs, even 100. But we need to do a better job with massive parallelism: thousands or even hundreds of thousands of CPUs.
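The "just first-order ODEs" observation can be made concrete with a toy example. The sketch below is purely illustrative (my own `simulate_rc` function, not any Cadence algorithm): it applies backward Euler, a standard implicit integration method, to a single discharging RC node. A production circuit simulator solves the same kind of system, except with tens of millions of coupled equations per timestep, which is exactly the scaling challenge Anirudh describes.

```python
# Toy "circuit simulation": one RC node discharging, governed by the
# first-order ODE  C * dv/dt = -v / R.
# Backward Euler (implicit, unconditionally stable) discretizes this as
#   v[n+1] = v[n] / (1 + h / (R * C))
# where h is the timestep. Real simulators solve systems like this with
# millions of unknowns at every step.

def simulate_rc(v0, r, c, h, steps):
    """Return the node voltage after `steps` backward-Euler steps."""
    v = v0
    for _ in range(steps):
        v = v / (1.0 + h / (r * c))
    return v

if __name__ == "__main__":
    # 1 V initial charge, R = 1 kOhm, C = 1 uF  ->  time constant tau = 1 ms.
    # Simulating 10 ms (10 tau) should leave the node almost fully discharged;
    # the analytic answer is exp(-10), which backward Euler approximates.
    v_final = simulate_rc(v0=1.0, r=1e3, c=1e-6, h=1e-4, steps=100)
    print(f"voltage after 10 ms: {v_final:.6f} V")
```

The implicit update is the key design point: an explicit method would blow up if the timestep were large relative to the circuit's fastest time constant, whereas backward Euler just loses a little accuracy, which is why implicit integration dominates in SPICE-class tools.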

We also need to move up a level; EDA always does. As Anirudh pointed out, Alberto has been working on system-level abstractions forever ("before you were born"), but we still need technologies that can work at the system level and give us a framework like Virtuoso, only at the system level, worrying about power, thermal, and software interaction rather than transistors.

Chi-Ping Hsu

Chi-Ping started off by answering the "Are we there yet?" question with a solid "No." There are 10^14 neurons in the human brain, while the largest chip-based networks are around 10^8, so we have another factor of a million to go. And if you look at IBM's neural nets, they are 10,000 times worse than the human brain in power efficiency, another long way to go.

He talked about what used to be called the design gap: in design, we cannot handle the complexity that manufacturing can afford. When we were doing 28nm and talking about 16nm, we estimated it would take one month of CPU time to run just one iteration from RTL to GDS.

Three more concrete areas that are opportunities for EDA:

  • Machine learning, of course
  • EDA and IP, these are still pretty separate and we could optimize across that boundary
  • Advanced packaging and design: today we can connect two wafers with millions of microbumps, but there is no tool to support that beyond the relatively straightforward case of putting memory on a processor

Chi-Ping concluded that the future is bright (do we have to wear shades?) but full of challenges. We are still a factor of a million below what the brain can do, and 10,000 times less power efficient.

To be continued tomorrow...

Next: Future of EDA: The Q & A

Previous: The Future of EDA: The View from Academia