The Electronic System Design Alliance CEO Outlook took place on May 16. Bob Smith, ESD Alliance Executive Director, opened the show with apologies from Aart de Geus, who was unable to attend since Synopsys' earnings call was the following day. That left Ed Sperling, the moderator, with:
This is my summary of the event. I've not tried to be word-for-word accurate, so this is paraphrased and abridged. Just in case you don't recognize everyone, the above photo shows:
We've seen a big shift from designing a chip for functionality to designing a chip around the flow of data, and issues connected with the data. How does that affect design?
Joe: One of the most surprising things a few years ago was when Apple became the first company to come out with a 64-bit application processor. Until then, people had always talked about 64 bits as an address-space issue, but Apple didn't do it for that, it did it for power-efficiency reasons. So you can think of that as designing the silicon to the software that is going to be running on it. Design is increasingly taking a look at what the end-user application is. This is changing both design and validation, which have to extend up to validating a user software stack operating in the real world. So it's a much more holistic view of how you optimize a design.
Simon: This makes it a big challenge as to how you architect your design, and an increasing challenge as to how you optimize the architecture. Because it is about dataflow and heterogeneous architectures, this requires high-level simulations and running much larger datasets through. It's about how efficiently the system can use the data and re-use the data once it's on-chip. This makes simulating these designs much more complex, and we need to be able to make tradeoffs between the components that we might put in these systems. So there are lots of design challenges in this data-driven era that we're in.
LBT: I tend to agree with Simon. We are moving into a data-centric era, with the whole infrastructure from cloud, network, and storage, all at scale. AI, machine learning, and data analytics are all moving to what I call "software 2.0": basing software on data and a model. This is a new era.
Dean: I think it is one of Amdahl's rules of thumb that for every increment in CPU power you have to have a certain amount of I/O, and that rule has been pretty accurate. When we did the first 8-bit processors, they had 8 bits of I/O and 8 bits of memory bandwidth, and now the number of pins has gone up pretty proportionally to the amount of CPU capability that we have. With the introduction of GPUs we went a little more parallel and started to get a lot more data in. But the design of CPUs and processors has always been driven by the applications on top. Even back in 1989/90, at Apple we looked at the code and the graphics rendering and designed the CPU to make them go faster. That's what Intel does every day, all day. So I'm not convinced that there is a drastic change going on here. Clearly, the AI stuff that Lip-Bu mentioned is permeating the industry. TPUs and GPUs require more data bandwidth, so getting the data in and out is a major bottleneck. You can't even change models since there's not enough bandwidth to swap them in and out. But this is not a huge divergence from the trajectory of history.
Babak: If I can add a little bit to what Simon said, clearly there are a lot of architectural changes—we have reconfigurable designs, designs are much bigger, more complex clock and power management, vendor liability enters the picture as the complexity of verification can lead to some reliability issues. Architecturally people are building recovery mechanisms into the designs, further increasing the complexity. Silicon design companies are busy innovating with new design methodologies and at the same time EDA companies are innovating to keep pace. All this is creating opportunities.
Simon brought this up, but we are shifting from homogeneous planar designs to multi-chip and multi-die designs, and to vector and matrix architectures. But to your point, Dean, this is still a big jump in evolution. What does that mean for design and what's needed from a tool standpoint? And from a manufacturing standpoint?
John: I guess I can take the manufacturing thing and someone else can come back on the design side. Test used to be at the end of the flow but with these advanced packaging techniques it is now in the middle. So the manufacturing flow is now much more complex and the risk is on the product groups and not on the foundries so much. So there is a need for more sophisticated screening approaches, more data is required in that assembly step, and integrating across the supply chain is now a requirement to drive reliability, as well as getting costs and yields to where they need to be. It's now a much more mechanical process whereas it used to be much more of a chemical process.
Babak: When you try to integrate many of these parts, you find that you can't do it due to incompatibility of processes. So you end up with multi-chip modules. How do you manage the dataflow in these designs? A lot of effort will be spent on two aspects of this, what the EDA developers have to do, and how you use AI to reduce the dataset to get to the 6σ or 7σ that we need. And the other level is how to simulate the packaging and the interaction of the chips—radiation effects, thermal effects, IR drop. But also the dataflow that has to go through the whole package.
LBT and Joe, you both work in companies that have tried to pull a lot of this together. But as designs get more domain-focused, sometimes they come in much smaller lots?
Joe: They all want to have some aspect of system design co-optimization, tools that will help you partition the design between all these different elements that you are going to integrate into the package. They need help on decisions on process technology, communications bandwidth, even stuff like pin placement. Then down the other end of the stack you deal with issues like "How do you get Gerber to talk to GDSII?" Packaging is a Gerber world and IC is a GDSII world. So bringing together the packaging world and the IC world presages a whole lot of opportunities and challenges we need to speak to in the next couple of years.
LBT: In some ways the complexity of the design will increase a lot, verification will become more difficult. But, to Joe's point, you are trying to address the whole system, not just the silicon. So system analysis becomes very critical.
Prakash and Dean, you are working in areas where more and more needs to be brought together. How is it working in a world with the platforms from the big three EDA players?
Prakash: A lot has been said about verification complexity by Joe and Lip-Bu, and that is true. But time to market remains the same, so the productivity pressure on designs is much higher. This is an area where much of the innovation in design is occurring. We are seeing a push-button shift left, where you automate more of the verification earlier, because at each generation the verification effort goes up by 10X. This is not just pushing the old methodologies, but new approaches. At Real Intent, our focus is on static signoff. As you said, a challenge is integrating into the larger platforms. The difference between verification tools and design tools is that the output of verification tools is consumed by humans, so integration is all on the input side and not on the output side.
Dean: So Ed, your question was how do you tie all the tools together. All major customers using IC Manage are using a mix of the three major vendors, and they all have a flow or a methodology, which is considered very proprietary and which they are very resistant to change. But it is all about managing your data, and the folks that win all have rigorous methods for managing their data globally.
Another big topic we've seen is the introduction of chiplets: IBM, Intel, AMD. Simon, what problems do you see down the road?
Simon: It's back to the complexity challenge and how you build ever more efficient systems. Ideally, you want to integrate everything onto one die...well, you can't do that. So integrate it into a package. It is possible to do this, but it is still pretty hard today. But this will become a technology that is used by lots and lots of people. There's obviously an opportunity for the EDA industry here to abstract away from the physical issues you have to deal with and to make this a much more commonplace design approach. This is the way that we will keep integrating architectures into the future and delivering more and more performance. There's a role around standardization to make it easier to mix-and-match some of these chiplets. In the next 5 to 10 years, all these issues will get resolved and it will become a common design technique.
Babak, you are neck-deep in this world, trying to integrate these pieces. What problems have you encountered?
Babak: There are standards being worked on, especially in inter-die interconnect bridges. There is a lot of opportunity for EDA to work in this area, but also for standards bodies.
John, how do these designs even get tested?
John: To go back to something Babak said, we've been driving a lot of standardization of the equipment for the assembly flow, since it is still very human-intensive. But it is becoming "lights out" like the front-end fabs, so data collection is very important because you need to understand what happened. Some components have a chip ID that identifies where they came from, but there are a lot of other components, such as passives and sensors, with no traceability. Having traceability through that flow is very important and becoming a requirement. Testing has become an area where you need to be able to pass data up and down. Right now, this is all a bespoke solution for each product. We need to make this a more mix-and-match flow that is not so engineering-intensive.
John: Right now, semiconductors are interesting to the investment community, and even the general public. The President of the US talks about it, and many other folks. There's a shortage, but we should use this opportunity to communicate the roadmap in front of us, because what Moore's Law did for the world was communicate that we are going to continue to do things that are exciting and better. We should do a better job of communicating that roadmap while we have the advantage that the world is looking at us.
Dean: When do we get John Kibarian's Law?
John: I'm not that smart! It has to be someone else. But I do think that it is really valuable.
Dean: There is a lot to be said for putting up predictions, and they are very different from what they were before because we are pushing up against the limits of physics.
Simon: One of the things that is both challenging and makes it more interesting is that the industry isn't all going in one direction right now. It is no longer just "can you make smaller, faster, lower-power transistors?" because we have an explosion of tiny edge endpoints, massive compute going on in the cloud, and new network technologies evolving that require really funky RF wireless, so it's going in lots of directions at once. It is multi-dimensional. And, to John's point, let's take advantage of our industry being in the spotlight and get more people going into it, because let's face it, we're all too old to solve these problems. We need the next generation.
Dean: I like your comment, Simon, about going in multiple directions. We have three poles, the edge/mobile/IoT pole, the data center pole, and the AI pole. We haven't had so many strong poles historically.
Simon: I think the biggest pole is the electrification of everything. Power electronics is an area that is suddenly going to become really interesting because we have to re-wire the whole world.
Babak: And another one is "software-defined everything". So many areas and it is limiting to have just one approach.
Given the electrification of everything, what does EDA become? Is it even an industry anymore or is it something else?
Dean: I think EDA will continue much as it has for 50 years, and it is going to hit a new stride in value with the addition of learning systems and AI. EDA is also slowly transitioning to the cloud, with a huge migration toward using the cloud for some things and on-prem data centers for others.
Simon: EDA is there to abstract physics problems to allow designs to be created as quickly as possible. I don't see any reduction in the demand for EDA, so as an industry we're entering a new phase, and the problems are not getting any simpler.
LBT: It is an exciting time to be in the EDA business since there is a lot of demand. EDA is expanding due to design activity, complexity, packaging, chiplets, and process nodes now chasing down to 2nm. I see tremendous design activity; we just need to get more talent to join.
Prakash: I'm more in the front end, but it's still rapidly evolving. One big change is that the bigger companies are doing more of the innovation and there are fewer smaller companies, but the industry is driven by innovation. We need to make the industry more attractive to work in and attract talent. Technically, over the last 20 years, I think that has reduced a bit. But it's a great place and I love being in EDA.
Joe: There was a time a few years ago when in most customer meetings we would be told to make our existing tools faster. But now customers want to take on the whole thing: system, technology scaling, 3D. We are getting huge amounts of demand to move into new areas, not just more of what we already have. We need to scale design productivity in ways it never has been scaled before. It's going to be an industry that is ripe for innovation.
Think about system companies like Google or Apple creating their own chips. How does that change the dynamics where each one of you sits? Are you taking broad horizontal approaches, or targeting smaller niches?
Simon: Obviously we focus on the core block of a microprocessor. But one change is that people interact with us to solve a bigger problem that involves optimizing the whole problem, of which processors are just a part.
Dean: Customers are pulling us into new tools and new ways of doing things. The datasets are so large we need new ways to split up the data, move it to the cloud and back, and create new analytics.
LBT: We have about 45% of our revenue from system and hyperscale companies. AI cuts across all verticals, and is especially important in medical, scaling to the treatment of millions of people. And the heart of innovation there is semiconductors.
Let's draw to a pause here. It's a really interesting time in this industry.