One of the things I learned when Verisity purchased Axis was the difference in mindset between verification using emulation versus simulation. Emulators generally cost more, so companies have fewer of them; logic simulators cost less, so companies have more. Each technology has its pros and cons that I don't want to get into today, but one of the things I learned from my Verisity friends was that verification could be improved by running massively parallel simulation. Sure, most projects have a suite of thousands of tests that are run every day to make sure nothing is broken when changes are made, but this is not what they were talking about. Running more tests only makes sense if there is a way to create different results, and a way to gather all the results, identify the bugs, and measure what really happened during all of the simulations. The improvement came from actually finding new bugs that were triggered by a smarter verification environment, and verification tools like Specman and Enterprise Manager made it possible.

One result of the transition to constrained random stimulus generation, checking, and functional coverage has been a digital verification flow that typically uses a farm of machines to run many simulations in parallel, not just to cut down the time needed to run the same tests over and over, but to automatically create new tests that hit new bugs. Even small companies run hundreds of simulations at a time, and larger projects may run thousands. Recently, I heard one company is running hundreds of thousands of simulations in parallel. That is a very high license-to-engineer ratio.
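The flow above can be sketched in a few lines of Python. This is a toy illustration, not any Cadence tool's API: each "simulation" is just a seeded function that generates constrained random stimulus and reports coverage and failures, and the "farm" is a local process pool. The failure condition and coverage model are invented for the example.

```python
import random
from concurrent.futures import ProcessPoolExecutor

def run_simulation(seed):
    """Hypothetical stand-in for one simulator job launched with a unique seed."""
    rng = random.Random(seed)
    # Constrained random stimulus: every seed drives a different input sequence.
    stimulus = [rng.randrange(256) for _ in range(1000)]
    # "Coverage" here is simply which input values this run exercised;
    # "bugs" are inputs that trip a deliberately planted check (toy example).
    coverage = set(stimulus)
    bugs = sorted(v for v in coverage if v == 0xDE)
    return coverage, bugs

if __name__ == "__main__":
    seeds = range(100)  # a real farm might run thousands of these in parallel
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(run_simulation, seeds))
    # Merge results across the whole farm: total coverage plus failing seeds.
    total_coverage = set().union(*(cov for cov, _ in results))
    failing_seeds = [s for s, (_, bugs) in zip(seeds, results) if bugs]
    print(f"merged coverage: {len(total_coverage)}/256 bins")
    print(f"seeds that hit the bug: {failing_seeds[:5]}")
```

The point is the merge step at the end: running more seeds only pays off because coverage and failures are gathered centrally, so the farm answers "what did all those simulations actually accomplish?"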
Analog design and verification is something I know almost nothing about, but it seems to involve simulation and a lot of waveform inspection. This has led to a flow where each engineer sits at the screen, simulates, inspects, and iterates until he is happy with the design. Under this flow the license-to-engineer ratio is 1.
Recently, I was talking to someone who has been around Virtual Platforms for software development for a long time. He was lamenting that the Virtual Platform has not progressed from the analog model to the digital model described above. His vision for Virtual Platforms has always been massively parallel simulation. Features like checkpointing, dynamic scripting, reverse execution, and easy migration of simulations from one machine to another should all support this vision. My conclusion is that both analog and embedded software would like to move to the digital verification model, but are not there yet. In the embedded software space, the same types of tools that enabled the transition in digital verification are needed: constrained random stimulus generation, checking, code coverage, and functional coverage.
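How those ingredients fit together can be sketched with a toy example. Everything here is hypothetical (the `FifoModel` device, the constraint weights, the coverage bins); it only illustrates how constrained random stimulus, checking against a reference model, and functional coverage combine, the way a verification environment around a Virtual Platform would combine them.

```python
import random

class FifoModel:
    """Toy 'device under test', standing in for a model on a virtual platform."""
    def __init__(self, depth=4):
        self.depth, self.data = depth, []
    def push(self, v):
        if len(self.data) < self.depth:
            self.data.append(v)
    def pop(self):
        return self.data.pop(0) if self.data else None

def constrained_random_test(seed, n_ops=50):
    rng = random.Random(seed)
    dut, ref = FifoModel(), []      # device under test vs. reference model
    coverage = set()                # functional coverage bins hit by this run
    for _ in range(n_ops):
        # Constraint: push twice as often as pop.
        op = rng.choice(["push", "push", "pop"])
        if op == "push":
            v = rng.randrange(16)
            dut.push(v)
            if len(ref) < 4:
                ref.append(v)
        else:
            got = dut.pop()
            want = ref.pop(0) if ref else None
            assert got == want, f"mismatch at seed {seed}"  # the checker
        coverage.add(("occupancy", len(dut.data)))  # bin: FIFO fill level
    return coverage
```

Each seed produces a different legal operation sequence, the checker compares the device against a reference model automatically, and the coverage bins record which fill levels were actually exercised, so runs can be farmed out and their coverage merged just like digital simulations.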
There is a great opportunity for Virtual Platforms to make a significant contribution to embedded software, just as the logic simulator did for digital design, but without a verification methodology the best the Virtual Platform can do is the one-per-engineer model, where every software engineer sits by the machine, runs and debugs code, and manually inspects the results until happiness is achieved. I'm sure a lot of the original Specman salespeople can tell stories about visiting engineers who would say, "I have a Verilog simulator and a good waveform debugging tool; what else could I possibly need?" It appears the Virtual Platform is playing the role of the logic simulator for embedded software engineers, but we know from history that having a simulator is not enough. It's the foundation on which everything else is built (as I describe in the Verification Hierarchy of Needs), but an entire verification ecosystem around the Virtual Platform is needed for it to reach its full potential.
Why am I bringing this up? Because I work for a tool vendor that needs to sell more licenses? Not really. Sure, more licenses are good, but customers aren't really interested in licenses; they are interested in results. It's intriguing that both the analog world and the embedded software world seem to be pushing in the same direction. While reviewing the similarities, I came across a paper titled "Coverage-driven verification for mixed-signal systems". Some of the challenges relate to checking that an analog waveform or a C function is doing what it should. It's probably not as easy as checking digital signals for 1s and 0s or buses for hex values, but I'm confident these challenges can be overcome and that embedded software on the Virtual Platform will adopt the digital verification flow. I also hope the same thing happens to analog design and verification, but for now I'll keep focusing on the Virtual Platform and leave the analog part to my more than competent Cadence colleagues.