For about the first 15 years of my reporting on EDA, which began in 1985, EDA was a dynamic, thriving industry with double-digit annual growth rates and lots of excitement. The last few years have been different. Now the industry press and the blogosphere seem to be filled with negative comments and doom-and-gloom predictions about the commercial EDA industry. As Gabe Moretti wrote in his recent “EDA bashing” blog, “it seems lately that if you want to be considered an authority on the EDA industry you must be negative about it.”
True, we’re no longer seeing the rapid growth of the 1990s. EDA revenues have been in the $4 to $5 billion range for some time, and the latest EDA Consortium (EDAC) market statistics survey report noted that worldwide EDA revenue for the third quarter of 2008 was down 10.9 percent year-to-year. As a sign of the times, the DVCon conference held an “EDA dead or alive?” panel this year. No one would have imagined such a panel 10 or 20 years ago.
Is EDA really facing a life-or-death situation? To paraphrase Mark Twain, I think reports of EDA’s demise are greatly exaggerated. In part one of this blog I’ll address a few questionable claims that have been bandied about lately. In part two I’ll discuss what I see as some of the real challenges facing the EDA industry.
Claim #1: Major electronics OEMs and semiconductor companies are rejecting commercial tools and returning to internal CAD tool development.
Among other places, this claim was debated at a “Build or Buy” panel that I covered for EE Times in 2006, and it was also raised at this year’s DVCon panel. Some background: at a keynote speech at the International Symposium on Physical Design (ISPD) in 2005, Gary Smith – then with Dataquest – said that 27 percent of engineers were using internal tools in 2004, an increase from previous years. Sensing the possibility of a page one blockbuster EE Times story about OEMs turning away from commercial EDA tools, I contacted engineering and CAD managers at large systems and semiconductor companies to find out if the trend was really true.
The results were inconclusive. While some respondents said EDA vendors weren’t meeting all their needs, only a couple said they were increasing internal tool development, and several said they were decreasing internal tool development. I ended up with a much more modest article. Later in 2005, for EE Times’ annual EDA user survey (since discontinued), we added a question about the use of internal tools. A surprising 56 percent of respondents said they use internally developed tools.
But what kinds of internal tools were they using? 81 percent of that 56 percent said “scripts that drive other tools.” That points to an interoperability issue, not a wholesale failure of EDA vendors to meet basic needs. The next item on the list was “system-level modeling tools” at 35 percent, which makes sense: those were early days for ESL, and system-level tools tend to be application-specific.
Now we’re in a recession, and as noted at the DVCon 2009 panel, companies are less likely to step outside their core competencies in times of recession. Internal CAD tool development is expensive and risky, and there’s no third-party support. I think you’ll continue to find internal tools where they’ve always been – in niche applications like high-end processor design, or emerging technology areas like ESL. But I don’t see internal tool development overtaking the commercial EDA industry.
Claim #2: Foundries will buy EDA vendors and offer “one stop shopping” for all your chip design needs.
This idea has been around for a while, and was detailed in an EDA Confidential blog posting entitled “Death of EDA as an independent industry.” The posting states that “the most likely scenario will be that foundries like TSMC will buy EDA companies and offer a perfectly integrated flow – Verilog to packaged chip – from one source.” Stated reasons include the consolidation among foundries, the difficulty of developing and supporting foundry process design kits (PDKs) for all the various EDA vendors, and the presumed convenience of “one stop shopping” with a fully integrated, foundry-supported tool flow.
Foundries, however, have yet to show any interest in becoming EDA vendors. If anything, what would make sense is a decision by foundries to offer DFM software optimized for their processes. Foundries could have bought some of the many DFM startups that were around a few years ago, but they didn’t.
I think TSMC’s failure to buy Blaze DFM is indicative of just how hesitant foundries are to get into EDA. The two companies collaborated to develop TSMC’s PowerTrim service for leakage reduction last year. When Blaze DFM reportedly closed its doors in December, TSMC probably could have bought the company at bargain-basement prices, but that didn’t happen – and Blaze was purchased in February by lithography startup Tela Innovations.
Most foundry customers today have EDA tools from most or all of the major EDA vendors, and probably some startups as well. If a foundry bought a large EDA vendor, they’d suddenly be competing with all the other EDA vendors. Would they try to lock fabless customers into a single EDA provider flow? That wouldn’t go over well if fabless customers want freedom of choice for both EDA tools and foundries.
Claim #3: The migration to lower IC process nodes is slowing, and EDA revenues depend on that migration.
There is a belief that migration to sub-65 nm nodes is, or will be, slower than the movement to previous nodes. But what does the data really show? Of course, it’s too early to know about 32 nm and below. At a keynote speech at this year’s DesignCon, however, as reported in EE Times, Mentor Graphics CEO Wally Rhines presented data from VLSI Research showing that the movement to 65 nm is proceeding at about the same pace as the previous migrations to the 90 and 130 nm nodes.
Even design teams staying at 65 nm or 90 nm have plenty of challenges to solve. Verification is taking up more and more of the design cycle, low-power design has become a priority, and many design teams want to move up in abstraction to boost time-to-market competitiveness. There’s also a lot of activity right now in analog/custom IC design automation, which is needed even at 130 nm. Thus, there’s a lot more driving EDA revenues than the move to 45 or 32 nm process nodes.
In part two, I’ll move on to what I see as some real challenges facing the EDA industry – challenges that pose both potential danger and opportunity.
In the past I used mostly commercial tools, but now I'm consulting in a CAD Department. Internal CAD does have its advantages. What design wants, they get. As soon as a bug shows up in Bugzilla, multiple developers and designers start commenting on the solution, examining the test case in situ. The developer gets on it, pronto.
Compare this to calling the hotline, establishing your credentials, explaining the situation, preparing and sending the test case, and tracking the PCR. Or you get your big cheese to convince some EDA big cheese to make an enhancement. Then you wait. When it shows up, it shows up.
Daniel -- it would certainly be helpful to know what percentage of tools in a given design flow are internally developed. But we still need to know what kinds of "tools" people are talking about. If the "internal tools" are simple utilities or collections of scripts, just getting a percentage won't tell the story.
To properly poll internal EDA tool development you have to ask a clarifying question like, "What percentage of EDA tools in your flow are internally developed?"
Just asking, "Are you using internal EDA tools?" doesn't get at the vast majority of EDA users that have a mix of internal and commercial tools.