At the recent Synopsys EDA Interoperability Forum,
the opening session focused on a 10 year review of standards and
interoperability between EDA tools. Three speakers -- Philippe Magarshack
(Central R&D Group VP, STMicroelectronics), John Goodenough (Vice-President
of Design Technology and Automation, ARM) and Jim Hogan (Private Investor) -- reviewed
their predictions from roughly ten years ago, and commented on how standards
have evolved since then and whether and how they enabled interoperability.
Philippe Magarshack kicked the day off by
reviewing some of the slides he had used ten years ago. They included the well
known Sematech/ITRS Design Productivity gap, and at the time, Philippe had
asked for enablement of RTL to layout flows, IP reuse and analog IP. For future
issues, he pointed to HW/SW co-development methodologies and system-level
entry. He noted how the methodology of platform-based design has addressed and
enabled a lot of the issues in question.
The rest of his talk focused on
system-level design, which has developed quite a bit since 2002. At that time
ESL, more specifically SystemC TLM, was used prior to the IP RTL freeze for
hardware verification. By 2004 ST had expanded the use of ESL to verify
hardware SoC integration prior to the SoC RTL freeze. In 2008 they adopted
pre-RTL debug of HW/SW validation.
The next step: in 2012 Philippe expects
TLM platforms to be used in the Request for Quotation (RFQ) phase of projects, for
marketing purposes, and for bring-up of full software stacks prior to silicon
availability. Today more than 1,000 users rely on ST's internal
infrastructure and libraries for HW/SW design and verification, spanning hundreds of
TLM or mixed TLM-RTL platforms. Counting the last ten years a success,
Philippe defined the 2012 priorities in OSCI/Accellera to be focused
on model configurability and control, interrupt handling, address map
management, point-to-point communication, and system-level synchronization. For
IP-XACT, he said, the focus should be on removing limitations on "busdef,"
enablement of parameterized IP, and a unified power view.
Beyond 2012 the biggest challenges
identified by Philippe are (1) moving more software engineers from the post- to
the pre-silicon development phases, (2) enabling HW/SW architects with
specification-level models for earlier commitment on functionality,
performance, power and safety/security, (3) verification reuse from
specification-level to SystemC/TLM to VHDL/Verilog RTL and (4) easy integration
of 3rd party IP TLM models, which will require evolution of the TLM
2.0 standard. Overall the takeaway from Philippe's presentation was a set of
successful results of standardization and interoperability for the user base
over the last ten years, with a roadmap of at least ten more.
The second presentation came from John
Goodenough. In his talk "Design in the 21st Century" he stressed the
importance of software, debug, and energy-aware design. He mentioned in
passing that his birthday falls on Groundhog Day, and looking at the challenges
felt like Groundhog Day too: the requirements from 10 years ago still hold
true, and today feels like a repeat because the complexity of the designs we
are dealing with as an industry simply has grown so much. Design still is about
modeling, integration and verification. Low power design and managing
complexity require new thinking and new tools. EDA will play a critical role in
enabling the industry, but interoperability remains crucial as nobody has all
the EDA and nobody has all the IP. Collaboration between the players was, is
and will remain key!
While the first two presentations had already left me with the impression that most
of the interoperability challenges are in the pre-RTL and system-level domains,
the last presentation of the opening session made it crystal clear! Jim Hogan
presented the sequel to his 2003 presentation titled "EDA Interoperability -
The Good, Bad and Ugly," appropriately titled "A Fistful of Dollars." Jim's
presentation from 2003 (while he was at Artisan) was completely focused on back-end
and layout-related issues. This time around he focused on the system level.
The beginning of his presentation also
inspired the title of this blog post. To underline the importance and the
potential widespread reach of standards, he first went all the way back to
Roman times. Equine anatomy - its rear end to be precise - apparently is the
source of the width of the U.S. railway gauge at 56.5". Two horses' rear
ends side by side set the width of Roman chariots, which got standardized with appropriate
SPQR stamps and then went on to leave their impression all over Europe as shown
here from Jim's presentation. When the first railways in Britain were created,
they used the standard width still coming from Roman "Pax Romana" times, and
eventually they made their way to the U.S.
What can we learn from all this? Well,
first of all Jim argued that standards were originally intended for military purposes;
from those purposes specifications were written, which led to the creation of
infrastructure and flows. Communications and commerce followed immediately,
supplanting the original intent. Finally, as the example elegantly showed,
standards may have a life expectancy well beyond their original intent.
Reviewing the time since 2003, Jim concluded that in 2003 the focus was much closer to
the transistor, design for manufacturing and masks. Start-up companies rushed
to fill the gap and were acquired. In parallel the industry moved from 180nm to
28nm, while market leaders held on to their 2003 proprietary standards. Also,
some progress was made in system languages, RT-Level and Behavioral IP.
In his view a huge opportunity lies in
standards for SoC Realization. As the example showed, standards can fill gaps
and accelerate commerce. Given that they eliminate barriers to growth, they are
a good investment of time and effort. Jim
used a couple of examples in the SoC Realization domain, especially around
HW/SW co-optimization, high-level synthesis, IP integration and virtual
platforms. In discussing the beneficiaries of standards -- the users -- I found
the stacked triangles he showed especially interesting. Jim stacked the team
sizes for the various software and hardware development tasks, and summarized
the situation with the picture at the left.
After having heard all three speakers it
is probably fair to conclude that the next set of challenges for standards and
interoperability lies in the pre-RTL and system-level domains. It will be
interesting to see what that means for the different users benefitting from standardization
efforts, and also which standardization committees will be the most influential
ones. Given the mix of hardware and software developers, classic
standardization committees from the software domain - like the Object Management
Group (OMG) - may become much more influential for hardware in 2012 and beyond.
Let's hope that the results will be
achieved in less than ten years and will be as long-lasting as the line from
horses' rear ends to railway gauges!