I will be co-moderating this board together with Stylianos; a formal introduction will come sometime later.
It appears to me that there has recently been significant interest in developing new methods for verification planning. Classic planning methods have been known to result in late releases (in some projects, multiple months late) and unpredictable quality.
I believe this forum could become a catalyst to understanding the underlying problem we are trying to solve, and to both discuss and shape the solutions that are taking form.
To trigger some discussion, I want to pose a simple question: In what ways is Verification different from Design? (ways which might affect planning)
I believe if we clarify this question, we can better understand why some of the classic methods fail with verification.
Looking forward to your responses,
Akiva asked: "I want to pose a simple question: In what ways is Verification different from Design? (ways which might affect planning)"

I'd like to toss out a few ideas. Although both design and verification engineers start with a functional spec and proceed toward a common goal of releasing as functionally correct a design as possible, on schedule, the activities and focus of each differ considerably.

Design engineers design and implement the design within constraints of time, resources, process technology, etc. They need to be concerned with the application environment to some degree, but are primarily concerned with the world inside the design.

Verification engineers design and implement a model of the application environment in which the design is intended to operate, within constraints of time, resources, and the capabilities of the verification methodology and tools. They need to be aware of implementation aspects of the design, but are primarily concerned with the world around the design. For example, in the case of a digital camera design, the application environment would include a model of the user (sets modes, takes pictures, etc.), other entities that connect to the camera (printers, PC, memory card, battery, etc.), the infinite set of images the camera is required to capture, and so on. In addition, the verifier must design and implement a set of checkers to verify that the design responds correctly, and some means to measure how well each feature was exercised (a.k.a. coverage).

With that background, and considering planning to be the process in which a list of features to verify is built, the approach (formal/simulation/acceleration, etc.) to verify each feature is prescribed, and the environment(s) are specified: both designers and verifiers work from the set of design features during planning, but the verifier is concerned with using the feature descriptions to determine the verification approach, specify the requirements for the verification environments, and devise coverage measurements to ensure that each feature is adequately exercised.

An additional thought relates to tracking the progress of the development process. Since the percentage of specified features that have been verified can be considered a reasonable measure of progress, the burden of tracking progress often falls on the verification engineer. This requires the planning process to be approached in a way that the information captured can provide a credible measure of progress, one that addresses both completeness (have all features been implemented?) and correctness (have they been implemented correctly?).

I hope these ideas help move the discussion forward, and I look forward to participating further.

Regards,
Dean
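Dean's framing of the plan as a feature list, with an approach per feature and progress tracked along both completeness and correctness axes, could be sketched as a small data model. This is a minimal illustration, not any particular planning tool's schema; the feature names, fields, and `progress` helper are all hypothetical.

```python
from dataclasses import dataclass

# Hypothetical verification-plan entry: each feature records the chosen
# verification approach, whether it has been implemented (completeness),
# and whether its checkers/coverage have passed (correctness).
@dataclass
class Feature:
    name: str
    approach: str              # e.g. "simulation", "formal", "acceleration"
    implemented: bool = False  # completeness axis
    verified: bool = False     # correctness axis

def progress(plan):
    """Fraction of planned features that are both implemented and verified."""
    done = sum(1 for f in plan if f.implemented and f.verified)
    return done / len(plan) if plan else 0.0

# Toy plan for the digital-camera example above (feature names invented).
plan = [
    Feature("capture_image", "simulation", implemented=True, verified=True),
    Feature("write_memory_card", "simulation", implemented=True, verified=False),
    Feature("low_battery_shutdown", "formal"),
]
print(f"verification progress: {progress(plan):.0%}")  # 1 of 3 features done
```

Counting only features that clear both hurdles is the point: a feature that is coded but unchecked, or checked against an unimplemented stub, contributes nothing to the metric.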
Verification requires engineers to worry about two big areas:

1) Verifying that the implementation meets the specification. Usually only normal-mode behavior is specified, and this task is complex given that engineers take a fairly narrow sample of the design space, verifying what they perceive to be representative scenarios that encompass the entire design space.

2) Verifying that everything falling outside what is specified, such as error scenarios and their temporal aspects, is handled gracefully without bringing down the system. This can be an order of magnitude more complex, as the permutations become infinite.

While designers are wrestling with meeting timing budgets and reducing power and area to push integration limits, verifiers are modeling the world around the design to verify it as thoroughly as possible, often with the same or fewer human resources. This leaves verifiers no choice but to innovate and automate, or risk product quality.

A key factor at play is reuse. Maturing IP-reuse cultures are making the verifier's job appear more time-consuming as a percentage of total "design time". This usually means more pressure on the verification team to get the job done faster. Sophisticated verification teams are addressing this by investing in reusable verification IP and verification-plan reuse to keep pace with front-end design. This allows them to devote their time to defining key scenarios and interactions and refining the verification plan, versus spending months developing verification IP for a project.
Good responses. I'd like to add that my first observation is that design is "bounded" while verification is "unbounded".

To explain: when a designer has implemented the logic defined in the specification, he is "done" designing. He can improve the code, add comments, etc., but aside from bug fixes (which can be considered part of verification) his job is done. In verification, there are always more things that could be done. The space is infinite and the methods are multiple, so you can always do more verification. Have you defined all possible functional coverage? Have you added all the assertions possible? Have you achieved all the functional coverage? Have you achieved all the structural/code coverage? Have you written all the performance tests? Have you verified it with formal verification? Have you run it with other blocks? At the system level? On an FPGA? With the production software? With other components? Etc. The amazing thing is that you can continually find additional bugs (for a reasonably sized design). There is a point of diminishing returns, meaning the bugs found would never manifest, never be noticed, or never be cared about at a level above.

Since this forum will discuss verification planning, I think one of the most important aspects of verification planning is "bounding" the verification process at the onset. This means deciding what our absolute requirements are (vis-à-vis coverage and checking) and planning based on those requirements. The earlier this is done in the process, and the more open the process is, the better the verification plan will be.

Akiva
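Akiva's point that verification must be "bounded" up front can be illustrated with a toy cross-coverage example: even a few small fields multiply into a large bin space, so a plan declares a finite set of required bins and measures closure only against that set. This is a generic sketch, not any particular coverage tool's API, and the field names are invented.

```python
from itertools import product

# Toy illustration of why coverage must be bounded: crossing just three
# 3-value fields (borrowing the digital-camera example) already yields
# 3 * 3 * 3 = 27 cross bins, and real designs have far more fields.
modes       = ["photo", "video", "burst"]
resolutions = ["low", "med", "high"]
batteries   = ["full", "low", "critical"]

full_space = set(product(modes, resolutions, batteries))
print(len(full_space))  # 27 cross bins

# A plan "bounds" verification by declaring up front which bins must be hit.
required = {("photo", "high", "low"),
            ("video", "med", "full"),
            ("burst", "high", "critical")}

# Bins actually observed so far (e.g. collected from simulation runs).
hit = {("photo", "high", "low"), ("video", "med", "full")}

closure = len(hit & required) / len(required)
print(f"coverage closure: {closure:.0%}")  # 2 of 3 required bins hit
```

Closure is computed against `required`, not `full_space`: the bounded goal is what makes "done" a decidable question.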
Maybe some core differences are also cultural?

If a project succeeds, the designer did a great job. If a project fails, the verifier did a lousy job.

Why? Well, since we all want the design to succeed, verifiers are really going up against team spirit, aren't they? When a block is "done", they are the ones that make us go back to the drawing board and do it again. They are the ones that make us give management bad news when the week before we reported things were going great. And they are the ones that will never give us a straight answer (does it work? are you done? when will you be done? what more do you need in order to be sure? etc.) ;-)

At the end of the day, all too often verification engineers are the "bad guys". So maybe planning can be an effective way not only to predict, organize, and manage the project, but also to help keep those poor verification folks from getting all that bad vibe. :-)

PS: Feel free to substitute your own favorite metric of success for "succeeds" and "fails".
The key question is "What is the definition of success?" Since ASIC design first started, designers (and many managers) have defined success as "tapeout on time", while verification engineers define success as "working silicon"; from the company's perspective, the market may ultimately define success. Given that exhaustive verification, even at the module level, is becoming less and less likely, the definition of success blurs into "find and fix all the critical bugs that can reasonably be found, with the available resources, in a timely manner"... not exactly black and white. Proper planning is the only way to manage a project with such a complex goal with any certainty of achieving a successful outcome.
Just throwing out a couple of quick ideas. Design projects have a rather linear progression. Verification projects have (roughly) two phases:

1) Environment writing, which is quite linear and similar to design in some ways.
2) Testing and debug, where progress and convergence to project milestones can be very hard to predict.

Also, I think design projects are much more "module" oriented, while verification projects sometimes have cross-module tasks, or tasks that concern the DUT or the environment as a whole.
ThinkVerification (JL), I agree with your assessment, and I'll try to add a little interpretation. The less you are tied to somebody else, the more linear your progress can be. Thus, when designing a module or designing a testbench, your work can progress in a linear way.

But there is another, more important factor: "design done" and "verification environment complete" are subjective measures. In fact, one could code almost any design in a matter of hours (i.e., all the functionality is coded), but we might then spend years verifying, synthesizing, and rewriting that poorly written code. Another engineer could spend months on a design, and the verification can go quite smoothly. (Of course, there is no guarantee that the slow design will be any better than the fast kind.) Thus, what you are measuring when you measure "design complete" is somewhat meaningless.

Taking this notion to the extreme, I conclude that there is one main accurate measure of functional design and verification progress: the progress of functional coverage. For functional coverage to be measured, the functionality needs to be implemented and the verification environment needs to exercise it. When I measure this, I am measuring the outcome of the designer's and verification engineer's effort in terms of how close we are to signing off on the chip. This means that the designer's and verification engineer's progress are tied together (as they are in reality). Measuring progress based on objective measures tends to drive people toward the shortest path (design reviews, code reviews), while measuring people based on task completion drives them to complete tasks, sometimes taking unseen shortcuts to get there.

Ace Verification provides a course called "Coverage Revealed" which illustrates this process to design and verification teams.

Akiva (www.aceverification.com)
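The idea above, that functional coverage progress is the one objective measure of how close the combined design-and-verification effort is to signoff, could be rolled up across blocks as a single weighted number. This is a hypothetical sketch: the block names, bin counts, and weights are invented, and real flows would pull these figures from a coverage database.

```python
# Hypothetical signoff roll-up: progress is measured by coverage achieved,
# not by tasks declared "complete". Each block contributes its coverage
# ratio scaled by its weight (that block's share of the overall plan).
blocks = {
    # block: (bins hit so far, bins planned, weight in overall plan)
    "datapath":  (180, 200, 0.5),
    "control":   ( 40, 100, 0.3),
    "interface": ( 90,  90, 0.2),
}

def signoff_progress(blocks):
    """Weighted sum of per-block coverage ratios (weights sum to 1.0)."""
    return sum(w * hit / planned for hit, planned, w in blocks.values())

print(f"progress toward signoff: {signoff_progress(blocks):.0%}")
# 0.5*0.90 + 0.3*0.40 + 0.2*1.00 = 0.77
```

Note that the "interface" block is at 100% coverage yet the chip is at 77%: the metric ties everyone's progress together, exactly the property argued for above.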