

Different result with vManager and standalone simulation, with same seed

djdjuric over 3 years ago

Hi,

I am running standalone simulations with Xcelium, and the simulation passes. When I run it in regression, the simulation fails due to errors in the checkers that I have implemented. The checkers are simple: they only check voltage levels at a given point in time, or calculate slew.

When I take the same seed from the regression and run it standalone, the simulation passes. The results look OK when observed in SimVision.

The errors are, for example, too large a difference between expected and measured voltages.

I checked the vsif file several times, and I don't see any difference between the vsif and the standalone setup. The only difference is the seed, but as I said, even with the same seed I get a pass in standalone.

Sorry if my terminology is off; I hope you will understand the issue.

Did anyone have this kind of issue, and can you point me to what to check, or where to look? Thank you.

Best regards,

Djordje

  • StephenH over 3 years ago

    There are lots of possible ways your results might end up being different. Are you using SV randomize() in your testbench? If so, you could try adding "-xceligen heartbeat" to the xrun command in both setups and diff the logs to see if the seed sequence changes somewhere during the test. There are other solver options that can help; if heartbeat makes it look like a seeding issue, have a look at this article: How to debug Random Instability across the different runs of same test
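
    For example (a minimal sketch; the file names and the -f argument file are hypothetical, and your actual xrun invocation will differ):

        # Run the same test+seed in both flows with heartbeat tracing enabled,
        # then diff the logs to see where the seed sequence diverges.
        xrun -f run_args.f -xceligen heartbeat -l standalone.log
        # ... re-run the vManager session with the same extra option, then:
        diff standalone.log vmanager_run.log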

    If you have waveforms for both tests, you could use the "simcompare" tool to diff the waveforms to identify the earliest point in time where the waves differ - it's easiest to do this inside SimVision using the "SimCompare Manager" under the "Tools" menu.

  • djdjuric over 3 years ago in reply to StephenH

    Hi Stephen,

    First of all, I want to thank you for the fast response.

    I am not sure if I am using randomize(). I know that the things I drive from the test and testbench should be fixed. I know that I get a different seed each time I run the regression. But when I find the seed of a log in vManager and use it in a standalone sim, the standalone sim passes, while it fails in vManager.

    Maybe I should mention that other tests pass for me in regression, and they pass each time I run them. This issue affects only a couple of tests. This is verification using Specman. I am new to this, so I might get some terminology wrong.

  • StephenH over 3 years ago in reply to djdjuric

    What I was trying to say is that if you compare the logs for the same test+seed combination for vManager and stand-alone tests, you should be able to home in on the reason for the difference in result.

    If you're using Specman instead of UVM-SV, then you need to check the random stability / seed sequence a different way, but starting with a simple log diff would be best.

    You should also take care to check that the exact same set of "e" files is loaded, etc.
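
    One way to do that (a sketch; the log names are hypothetical, and the exact load-message wording varies between versions, so adjust the pattern to match your logs):

        # Extract the file-load messages from each log and compare them;
        # any difference points to a setup mismatch between the two flows.
        grep -i "loading" standalone.log | sort > loads_standalone.txt
        grep -i "loading" vmanager_run.log | sort > loads_vmanager.txt
        diff loads_standalone.txt loads_vmanager.txt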

  • djdjuric over 3 years ago in reply to StephenH

    OK, I will do so and get back to you. Thank you.

  • djdjuric over 3 years ago in reply to StephenH

    Hi All,

    First, I want to say thank you to StephenH. I found the issue by observing the log and looking into the actual waveform of the sim. My problem was:
    1. I was using a generate statement in instance generation to handle different setups.

    2. A generate block is elaborated only once, at the beginning, and it takes its values from the initial, default setup in the model. Because the verification flow was using a precompile script to speed up regressions, everything applied after that point was disregarded as far as the generate setup was concerned.

    Luckily, I managed to spot the issue after a lot of careful observation of waves and logs.

    Conclusion: the generate statement + precompile script combination was the issue.
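
    To illustrate the mechanism (a minimal, hypothetical SystemVerilog sketch; the module and parameter names are made up):

        // The generate branch is selected at elaboration time from the
        // parameter value known then. A precompiled snapshot keeps whichever
        // branch was elaborated, so a setup applied later at run time
        // cannot switch it.
        module supply_model #(parameter int SETUP = 0) (output real vout);
          generate
            if (SETUP == 0) begin : g_default_setup
              assign vout = 1.2;  // default-setup behavior baked into the snapshot
            end else begin : g_alt_setup
              assign vout = 0.9;  // only exists if SETUP was overridden at elaboration
            end
          endgenerate
        endmodule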

    Thank you, StephenH, I really appreciate the time and advice.

  • StephenH over 3 years ago in reply to djdjuric

    Great, I'm glad you found the source of the problem and were able to fix it! Thanks for the feedback. :)

