
ncsim out-of-memory when enabling code coverage

archive
archive over 18 years ago

Hi everybody,
This is my first post in the forum and I hope I'm writing in the right section.

My group is using a Specman-based verification environment to test a small system composed of two IPs. Additionally, we use nc-coverage to collect block, expression and FSM coverage data.

At the moment our most intensive runs are based on three different tests, each executed 80 times with a random seed. This scenario will become more complex in the near future.

Multiple test runs are controlled from the ncsim Tcl shell using some custom procedures. Note that only one instance of the simulator is run. Basically, at each iteration we perform the following tasks:

foreach test $test_list {
    - specman: load the specific test
    LOOP on test runs
        - ncsim: reset the simulator
        - specman: reload the environment
        - specman: test -seed=random
        - ncsim: set up coverage:
            * coverage -setup -dut :
            * coverage -setup -testname [format "%s_run%s" $test_name $tag]
            * coverage -setup -workdir [file join $outdir nc_cov_data]
            * coverage -code
            * coverage -fsm
        - ncsim: run
        - ncsim: dump coverage data:
            * coverage -code -dump
            * coverage -fsm -dump
        - update counters for the next iteration
    END LOOP
}  ;# close loop on tests
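In rough Tcl, the driver looks something like the sketch below. This is simplified, and specman_cmd is only a placeholder for however Specman commands are issued from the ncsim prompt in our scripts; the coverage, reset and run calls are the ones listed above.

proc run_all_tests {test_list outdir n_runs} {
    foreach test_name $test_list {
        specman_cmd "load $test_name"        ;# specman: load the specific test
        for {set tag 0} {$tag < $n_runs} {incr tag} {
            reset                            ;# ncsim: reset the simulator
            specman_cmd "reload"             ;# specman: reload the environment
            specman_cmd "test -seed=random"  ;# specman: new random seed
            coverage -setup -dut :
            coverage -setup -testname [format "%s_run%s" $test_name $tag]
            coverage -setup -workdir [file join $outdir nc_cov_data]
            coverage -code
            coverage -fsm
            run                              ;# ncsim: run the test
            coverage -code -dump             ;# ncsim: dump code coverage
            coverage -fsm -dump              ;# ncsim: dump FSM coverage
        }
    }
}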

Simulations run fine if code coverage is disabled (i.e. not instrumented during elaboration), but the simulator runs out of memory when code coverage is enabled. Memory growth can also be reduced by lowering the sampling rate of code coverage (e.g. dumping every 10 test runs instead of every run).

As I understand it, all simulation runs are independent and the simulator should flush its data at each reset/reload. However, memory usage seems to increase after every test run until the max_size limit is reached.

I think a possible solution could be moving the loop from the simulator shell to an external system shell, so that each test run starts and shuts down its own ncsim. This would also require rearranging our script system, which is something I'd like to avoid as we're planning to move to Enterprise Manager in January.
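Something along these lines is what I have in mind: a standalone script that launches a fresh ncsim per run, so all memory is returned to the OS between runs. It is only a sketch; the snapshot name, test names and the per-run command file are placeholders, not our actual setup.

#!/usr/bin/env tclsh
# Illustrative only: one ncsim process per test run.
set snapshot  "worklib.testbench:snap"     ;# placeholder snapshot name
set test_list {test_a test_b test_c}       ;# placeholder test names
set n_runs    80

foreach test $test_list {
    for {set tag 0} {$tag < $n_runs} {incr tag} {
        set ::env(TEST_NAME) $test         ;# picked up by run_one.tcl
        set ::env(RUN_TAG)   $tag
        # run_one.tcl would load the test, set up coverage, run and dump,
        # much as in the loop above, then exit the simulator
        exec ncsim $snapshot -input run_one.tcl > "${test}_run${tag}.log"
    }
}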

Any suggestions for a quick fix?

I'm using IUS 5.5 and Specman 5.0.3. The eVCs are compiled into a shared library and linked to ncsim_specman via the SPECMAN_DLIB variable. The IPs are developed in VHDL.

Thank you very much,
nico


Originally posted in cdnusers.org by nko
  • archive
    archive over 18 years ago

    nko,

    I am moving this thread to the Functional Verification forum, where the "e" and Specman experts hang out.

    Administrator


    Originally posted in cdnusers.org by Administrator
  • archive
    archive over 18 years ago

    I have seen something similar when instrumenting FSMs in older versions of IUS. Can you omit coverage -fsm and see what happens to your memory blow-out?


    Originally posted in cdnusers.org by douge
  • archive
    archive over 18 years ago

    Hi Niko,

    The reason NCSIM runs out of memory is that you have enabled FSM coverage in the run. You can do two things to compensate for this:

    1. Disable FSM coverage. Run the complete regression. Analyse the block, expression and functional coverage numbers at the end of the run. If the numbers are satisfactory, you can sign off your verification with just that.

    2. If you still need state coverage, you can try modelling the states in e and collecting transition coverage on them there, as sketched below. That should give you a good estimate of the state space covered.
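    For example, something along these lines; the unit, signal paths and state names here are invented for illustration, and the point is just to sample the DUT state into an e field and cover its transitions:

    type dut_state_t: [IDLE, BUSY, DONE];

    unit state_monitor_u {
        !state: dut_state_t;                     -- sampled copy of the DUT state register

        event state_clk is rise('top.clk')@sim;  -- sample on the DUT clock

        on state_clk {
            state = 'top.dut.state'.as_a(dut_state_t);
        };

        cover state_clk is {
            item state;
            transition state;                    -- covers observed state-to-state transitions
        };
    };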


    Originally posted in cdnusers.org by krish.p@samsung.com
  • archive
    archive over 18 years ago

    Hi guys, thank you very much for the suggestion.

    Unfortunately it seems that my memory blow-out is not related to FSM coverage, as disabling it does not remove the problem.

    I've tried various combinations of code coverage and it seems that the problem is related to expression coverage:
    - fsm only: OK
    - block only: OK
    - expression only: fail
    - expr+block: fail
    - block + fsm: OK

    I will try to collect data only on one IP at a time.
    Any other idea is welcome.
    Thank you again for your help.

    nico


    Originally posted in cdnusers.org by nko
  • archive
    archive over 18 years ago

    Hi guys.
    I found the cause of all my troubles! One of the eVCs instantiated in the environment was extending setup() in the following way:

    setup() is also {
        set_config(memory, gc_threshold, 300M);
        set_config(memory, gc_increment, 50M);
        set_config(memory, max_size, 1000M);
        set_config(memory, absolute_max_size, 1100M);
    };

    It seems that the garbage-collection threshold and increment were too high.
    Do you think this is a reasonable explanation of my problem?
    I didn't notice these settings until I had to modify the eVC configuration for other reasons; I was pretty sure Specman was running with its default memory settings.
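    For anyone hitting the same thing: either delete the override so Specman falls back to its default memory settings, or use much more conservative numbers. The values below are only an illustration, not a recommendation:

    setup() is also {
        -- lower the threshold/increment so garbage collection runs long
        -- before the process approaches the max_size ceiling (illustrative values)
        set_config(memory, gc_threshold, 50M);
        set_config(memory, gc_increment, 10M);
        set_config(memory, max_size, 1000M);
        set_config(memory, absolute_max_size, 1100M);
    };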

    Lesson learned :-)


    Originally posted in cdnusers.org by nko
