Cadence Community Forums › Functional Verification

Stats

  • Locked
  • Replies 8
  • Subscribers 64
  • Views 15865
  • Members are here 0
This discussion has been locked.
You can no longer post new replies to this discussion. If you have a question, you can start a new discussion.

In what aspects is verification different from design?

archive · over 19 years ago

Hi,
            I will be co-moderating this board together with Stylianos; a formal introduction will come later.
 
            It appears to me that there has recently been significant interest in designing new methods for verification planning. Classic planning methods have been known to result in late releases (in some projects, multiple months late) and unpredictable quality.
 
            I believe this forum could become a catalyst for understanding the underlying problem we are trying to solve, and for discussing and shaping the solutions that are taking form.
 
            To trigger some discussion, I want to pose a simple question: in what ways is verification different from design (in ways that might affect planning)?
 
            I believe that if we clarify this question, we can better understand why some of the classic methods fail for verification.
 
            Looking forward to your responses,
 
                                                            Akiva


Originally posted in cdnusers.org by Akiva
  • archive · over 17 years ago

    ThinkVerification (JL),
    I agree with your assessment. And I'll try to add a little interpretation.

    The less your work is tied to somebody else's, the more linear your progress can be. Thus, when designing a module or designing a testbench, your work can progress in a linear way. But there is also another, more important factor: "design done" and "verification environment complete" are subjective measures. In fact, one could code almost any design in a matter of hours (i.e., all the functionality is written), but we might then spend years verifying, synthesizing, and rewriting that poorly written code. Another engineer could spend months on a design, and the verification could go quite smoothly. (Of course, there is no guarantee that the slow design will be any better than the fast kind.) Thus, what you are measuring when you measure "design complete" is somewhat meaningless.

    When I take this notion to the extreme, I conclude that there is one main accurate measure of functional design and verification progress, and that is the progress of functional coverage. For functional coverage to be measured, the functionality needs to be implemented and the verification environment needs to exercise it. When I measure this, I am measuring the outcome of the designers' and verification engineers' effort in terms of how close we are to signing off on the chip. This means that the designer's and the verification engineer's progress are tied together (as they are in reality).
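    To make the idea concrete, here is a minimal sketch (in Python rather than e or SystemVerilog, and with hypothetical bin names) of what "progress of the functional coverage" means as a metric: you plan a set of coverage bins up front, the testbench monitor samples each transaction, and the sign-off metric is simply the fraction of planned bins that the implemented design and environment have actually exercised.

    ```python
    # Planned coverage model: every (opcode, burst_length) combination we
    # care about. These bins are hypothetical, for illustration only.
    planned_bins = {(op, burst) for op in ("READ", "WRITE") for burst in (1, 4, 8)}

    hit_bins = set()

    def sample(op, burst):
        """Called by the (hypothetical) testbench monitor on each transaction."""
        if (op, burst) in planned_bins:
            hit_bins.add((op, burst))

    def coverage_progress():
        """Fraction of planned bins exercised so far -- the progress metric."""
        return len(hit_bins) / len(planned_bins)

    # Transactions observed in simulation so far:
    for op, burst in [("READ", 1), ("READ", 4), ("WRITE", 1)]:
        sample(op, burst)

    print(f"functional coverage: {coverage_progress():.0%}")  # 3 of 6 bins -> 50%
    ```

    Note what the metric does and does not require: a bin can only be hit if the design implements the feature *and* the environment drives it, which is exactly why this single number ties designer and verification-engineer progress together.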

    Measuring progress with objective measures tends to drive people toward the shortest path to real quality (design reviews, code reviews), while measuring people on task completion drives them to complete tasks, sometimes taking unseen shortcuts to get there.

    Ace Verification provides a course called "Coverage Revealed" which illustrates this process to the design and verification teams.


    Akiva (www.aceverification.com)



    Originally posted in cdnusers.org by Akiva
