In my last blog entry, I implored Accellera to release UVM 1.0 quickly, standardizing OVM 2.1 as is, with full backwards compatibility and without trying to cram overlapping functionality from VMM into the base. Then they can add new functionality on top of this base, taking good ideas from OVM World Contributions, VMM, in-house methodologies developed by the member companies, etc.
It occurs to me that Accellera faced a similar situation during the early days of SystemVerilog. They chose Verilog-2001 as the base for SystemVerilog, maintained backward compatibility, and did not try to stuff overlapping VHDL constructs into the base. Then they added lots of new functionality on top of the base, taking good ideas from VHDL, Superlog, Vera, e, C/C++/SystemC, etc.
Just imagine what would have happened if Accellera had mashed Verilog and VHDL constructs together into the SystemVerilog base, or, even worse, had actually changed some of the syntax so that SystemVerilog was not backwards compatible with Verilog. The resulting language would have been a mess, and countless thousands of Verilog users would have had to recode.
I sincerely hope that there are no thoughts within Accellera of mashing VMM features into the UVM's OVM base or of doing anything to break OVM backwards compatibility. Accellera must ensure that UVM 1.0 = OVM 2.1 and then put its energy into enhancements for future UVM releases. Anything else would be a nightmare for the countless thousands of OVM users.
The truth is out there...sometimes it's in a blog.
Sorry, but I can't agree with your statement that "standards are now being driven by the EDA companies, not the users," at least in the case under discussion. The UVM is being standardized by an Accellera Technical Subcommittee (TSC) that has two "user" chairs and more members from "user" companies than EDA companies. Of course we EDA vendors care deeply about this standard and are working both within the TSC and outside of it (my blogs) to argue for what we believe is the right outcome. But I don't buy the argument that we are in the driver's seat; the users will determine the UVM.
I have no idea what you mean by "supporting them [customers] to have seats on the standards committees." Users certainly do not need the permission or support of EDA vendors to participate in standards bodies. Many such groups allow individual participation, and those that require company membership welcome "user" companies just as openly as EDA companies.
You sound as if you have some strong opinions and provocative ideas for the evolution of design and verification languages. I encourage you to participate in Accellera, IEEE, OSCI, or other organizations dealing with standards in this field. It can be intellectually stimulating, personally rewarding, and even fun!
Tom A. (former standards participant as a user in Accellera, IEEE, VSIA, PCI SIG, USB IF, 1394 TA, etc.)
The problem that I observe is that the standards are now being driven by the EDA companies, not the users, so the drive comes from marketing hype and profit rather than from advancing the standard to provide end clients with a right-first-time solution.
Techniques that some of us have known about for over 15 years have still not arrived in the standards.
Let's start with fundamentals: engineers design in bits and fields, yet no available system-level EDA tool supports bit slicing and field manipulation to the level required for system design, never mind true transaction interface definition. The definitions all seem to be done to suit the software engineer, and they are byte-oriented by definition. We do not need software design tools; what we are attempting to support is hardware definition, at a system level, in a world where time matters, in more than one sense.
Let's make more use of strongly typed languages to avoid introducing errors in the first place. This includes the management of data files used for testing, which should support enumerated types as field values, for example.
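The idea of bit-level field manipulation combined with enumerated, strongly typed field values can be sketched in a few lines. This is a minimal Python illustration with a hypothetical register layout (the `CtrlReg` fields and bit positions are invented for the example, not taken from any standard); the point is that an enum-typed field rejects undefined encodings at construction time instead of letting them leak into a test:

```python
from enum import Enum

class Mode(Enum):
    # Legal values for the 2-bit MODE field; encoding 3 is deliberately undefined.
    IDLE = 0
    RUN = 1
    DEBUG = 2

class CtrlReg:
    """Hypothetical 8-bit control register: MODE in bits [1:0], ENABLE in bit [7]."""

    def __init__(self, mode: Mode, enable: bool):
        if not isinstance(mode, Mode):
            raise TypeError(f"mode must be a Mode enum, got {mode!r}")
        self.mode = mode
        self.enable = enable

    def pack(self) -> int:
        # Field manipulation: place each field at its bit position.
        return (int(self.enable) << 7) | self.mode.value

    @classmethod
    def unpack(cls, raw: int) -> "CtrlReg":
        # Bit slicing: extract each field; Mode(...) raises ValueError
        # for an undefined encoding, catching the error at the source.
        return cls(mode=Mode(raw & 0b11), enable=bool((raw >> 7) & 1))

reg = CtrlReg(mode=Mode.RUN, enable=True)
print(hex(reg.pack()))                # 0x81
print(CtrlReg.unpack(0x81).mode)      # Mode.RUN
```

Feeding `0x83` to `unpack` (MODE field = 3) would raise a `ValueError` rather than silently producing an illegal field value, which is the "don't introduce the error in the first place" property argued for above.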
Once we arrive at the actual system testing phase, UVM is still well behind the curve as a methodology; if I had had only what UVM will offer, I'm sure I would not have achieved another right-first-time system device. System verification needs to concentrate more on not putting errors into the design in the first place, rather than on testing to remove them.
My list goes on.
Maybe it's time to actually start listening to your customers, or, even better, supporting them to have seats on the standards committees, rather than second-guessing my requirements.
I'll ignore the puerile flamebaiting and simply repeat my statement that successful standards have practical industry aspects as well as technical aspects. I'm not denigrating the technical aspects at all; I used to code testbenches for a living too. However, I contend that no matter how good the UVM is technically, its success is more likely if the standard is compatible with the huge OVM user base (and, through Accellera interoperability, VMM users as well). If others disagree, I'd love to hear your arguments.
We've seen how Cadence marketing has been correct in the past...
just google Mitch Weaver, Cadence, SystemVerilog (if the embedded links don't work)
I also remember the SystemC Verification Library
Also Vera and e
So, we've risen above language wars and now are fighting about methodology...
I'd rather hear from the Technical folks blogging/writing about OVM and VMM
the JL Grays, Mark Glassers, Janick Bergerons of the world who code testbenches for a living. The 'technical' postings on Doulos, JL's Cool Verification site, VMM Central and OVM World carry a lot more weight IMHO.
Users are still having to recode as we speak, from OVM 1.0 to 2.0 (which was a total change... just pick up Glasser's first book, which is on OVM 1.0 and way out of date now).
There are still `defines in the OVM code base posted to OVM World that allow it to run correctly in Mentor's and Cadence's simulators.
I agree with Steve that sound technical outcomes are based on the users of the technology, and their input should weigh into the decisions going on within the VIP-TSC.
Blanket adoption of OVM 2.1 is not the answer; the standards process takes time.
Steve, I'm not trying to tell Accellera how to do its job, such as which architect(s) should lead their technical work. But no standard is purely technical in nature; its success often hinges on other factors as well. I maintain that creating a new UVM CBCL that requires every OVM and VMM user to recode is much less likely to succeed. Backwards compatibility with OVM 2.1 ensures that OVM users don't have to recode and that VMM users can also preserve their investment via the interoperability solution already defined by the TSC. Call it lobbying if you want, but I'm suggesting how to make the UVM successful and not just one of those paper standards that no one actually uses.
Hi Tom: Thanks. BTW, my comments were about the UVM outcome, not OVM.
I guess it's premature for me to comment on UVM, as the VIP-TSC has not yet finished its work. I was commenting on the CBCL (common base class) definition, which is in progress but not a complete methodology.
Lastly, the best standards have been guided by a strong technical architect, such as Linus->Linux and Guido->Python, so who is the chief architect guiding UVM?
Wouldn't it be better to enable an architect within the TSC and not have too much outside lobbying on any positions? That seems more likely to produce a sound technical outcome.
To be sure, there is a lot that can be added beyond OVM 2.1, as the popularity of the OVM World Community Contributions area (the subject of my next blog) shows. But there are countless thousands of users using OVM very successfully today, so I don't buy the implication that it's "half constructed." As for interoperability, every OVM kit released on OVM World is fully tested on both Incisive and Questa, and we hear from our customers that VCS now runs the standard OVM kit as well. So: one implementation, fully interoperable.
First of all, I totally agree with no three headed camels.
It would be nice to hear one of the acknowledged VMM experts, or a VMM expert user, weigh in. In addition, I wonder about the worthiness of an intermediate milestone such as base classes, and why not at least a complete subset that forms a usable VM. It makes me think of the half-constructed buildings in Sunnyvale; I wish the Sunnyvale city government insisted on ensuring funding for a complete, usable stage.
On a different note, how interoperable are OVM implementations across Questa and Incisive? Is it designed to be a robust standard without ambiguity or implementation-dependent incompatibilities? Any plans for an interoperability suite?
I agree; no nightmares.