Recently I've had some thoughts about EDA software "quality"
and what that term really means. Clearly it's a much more complex issue than
designing software that doesn't crash. Quality also has to do with usability,
functionality, performance, and other closely related issues, making quality a
formidable challenge for EDA developers who are designing some of the most
complex software in the world.
Earlier this year Cadence and the Semiconductor Technology
Academic Research Center (STARC),
the Japanese semiconductor industry consortium, announced an ongoing
collaboration to leverage STARC's QA regression suites to ensure EDA tool
quality. The press
release mentioned several tools from the Cadence Conformal and Encounter
product lines. STARC tests software for such issues as bugs, CPU time,
accuracy, and proper documentation, potentially eliminating some of the
customer acceptance tests that STARC members would run on their own.
Just because it
doesn't crash doesn't mean "quality"
This kind of testing is concerned with more than avoiding the
"blue screen of death" from an EDA tool - it's also about making
sure tools work in flows. To learn more about the quality issue, I first spoke
with Nora Chu, senior product
marketing manager for Encounter, and some members of her team.
In that meeting we talked about several aspects of quality,
aside from the obvious issue of avoiding crashes and core dumps. One is "ease
of use." But what do we mean by that term? One meaning is that the software is
ready to use right out of the box, with minimal training. Another meaning is
that the user experience is consistent, with reproducible and predictable results.
Another quality issue has to do with flow testing.
Individual tools may work, but is there a coherent flow with relatively
seamless transitions between tools, and with all tools working together to
support the user's methodology? Is a given tool a memory or CPU hog?
Finally, functionality is part of quality, along with
quality of silicon (QoS) and quality of results (QoR). As Nora said, "the
biggest quality aspect is how useful the solution is."
FURPS sets the pace
I had a second conversation with Mitch
Lowe, vice president of the Cadence implementation group and
a member of the company's Quality Core Team. He introduced me to FURPS, a widely used software
quality model that Cadence follows. As you can guess, FURPS is an acronym: it stands for
Functionality, Usability, Reliability, Performance, and Supportability, and Cadence puts an EDA spin on each of these terms.
It's a tall order, but EDA software quality is an important
issue. It takes quality software to produce quality designs.