
How to plot a histogram of a relative error in a Monte Carlo analysis?

DomiHammerfall 1 month ago

Dear community

I would like to plot a histogram of a certain measure (the peak value of a transient signal in my case) after a Monte Carlo analysis, but I am interested in the relative change of said measure. To do so, I need to pass a reference value taken from a nominal simulation in order to calculate something like (peak_value - reference_value) / reference_value * 100. Alternatively, one could assume that the mean of the Monte Carlo samples equals the nominal value and simply take the mean over all MC iterations as the reference for the relative error.

This question has already been asked here and there, and there are also solutions in the manual, but I am unable to make any of them work.

Approach 1: Using calcVal() together with a Run Plan

My understanding is that different tests can only have different analyses, not different run types (single vs. Monte Carlo), so the way to go is a Run Plan. The Virtuoso ADE Assembler User Guide has a section called Using Results of One Run in Another that describes precisely this case: a global variable is created that fetches the output of a previous run via the ?run argument of calcVal, and that variable is then referenced in an output of the second run with the usual VAR() command. The manual doesn't explicitly say that a second test with a different name is needed, but I assume it is, because calcVal has a required test-name field and using the same test name in both runs might create a cyclic dependency. In summary, I perform the following steps.

1. I create a first run named nominal_run where I do a single run and calculate the reference value. Suppose the corresponding output is called nominal_amplitude.
2. I make a test copy and deselect the original test. Then I create a second run named MC_run with a Monte Carlo sampling and a global variable that looks like this: calcVal("nominal_amplitude" "myTest" ?run "nominal_run") (see the sketch after this list).
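
In sketch form, the setup would look something like this (the variable name ref_amp and the measurement ymax(VT("/out")) are placeholders assumed for illustration; the calcVal and VAR() usage is as described in the manual):

    ; nominal_run (single run): ADE output producing the reference value
    nominal_amplitude = ymax(VT("/out"))

    ; MC_run (Monte Carlo): global variable fetching the result of nominal_run
    ref_amp = calcVal("nominal_amplitude" "myTest" ?run "nominal_run")

    ; MC_run: output computing the relative error against the fetched reference
    relative_error = (ymax(VT("/out")) - VAR("ref_amp")) / VAR("ref_amp") * 100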

When I try to run the Run Plan, I get an error (the error screenshot is not reproduced here).

I double-checked that no variable or ADE output in the first run uses the calcVal function to reference the run itself, but I still get this error. Since this is also the procedure described in the manual, I am not sure where my mistake lies.

Approach 2: Using average() together with measureAcross()

My understanding is that ADE treats the MC iterations like any other swept parameter, so I should be able to query them with the measureAcross function. If I add the following three output expressions (sketched below)...
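
The screenshot with the exact expressions is not reproduced here, but they were of roughly this shape (output names and ymax(VT("/out")) are placeholders as above; the measureAcross argument order follows common ADE Assembler examples and should be double-checked against the user guide):

    peak_amplitude    = ymax(VT("/out"))
    average_amplitude = average(measureAcross("peak_amplitude" nil nil nil "mcparamset"))
    relative_error    = (peak_amplitude - average_amplitude) / average_amplitude * 100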

...then the output relative_error evaluates to 0 for all iterations (the results screenshot is not reproduced here).

What is the correct way of doing that?

  • JankoK 1 month ago

    Hi Domi,

    I can make both approaches work (in the latest version, IC23.1 ISR15).

    Approach 1: You need to uncheck the calcVal() variable in your Active Setup and delete it from your nominal_run. This makes sure that there is no dependency in the first run. You also need to add ?getFirstSweepPoint t to your calcVal() command; otherwise, only the first MC point gets the value passed properly.
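
    In other words, with the names from the original post, the variable becomes something like:

        ref_amp = calcVal("nominal_amplitude" "myTest" ?run "nominal_run" ?getFirstSweepPoint t)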

    Approach 2: You need to update the relative_error expression so that it is a measureAcross expression too, similar to your average_amplitude expression. This creates a waveform of your relative error across mcparamset. Once you plot it in ViVA, you can select the trace -> Measurements -> Histogram, set the number of bins to something like 10, and click Plot.
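
    For example (placeholder names as above, and the measureAcross argument order again to be checked against the user guide):

        relative_error = (measureAcross("peak_amplitude" nil nil nil "mcparamset") - average_amplitude) / average_amplitude * 100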

    Hope this helps!

    /Janko  

  • DomiHammerfall 1 month ago in reply to JankoK

    Approach 1: Can confirm! Unchecking the global variable with the calcVal expression in the active setup does the trick. Also, it is not necessary to make two tests with distinct names.

    Approach 2: Can confirm! 

    Thank you very much for your help, much appreciated!

    Bonus information for the interested reader: if, in addition to the MC sampling, another parameter is swept, then ?getFirstSweepPoint in calcVal needs to be replaced with a ?matchParams "all" or ?matchParams list("yourVariableName") argument, as in the sketch below.
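
    A sketch with the placeholder names from above (yourVariableName stands for the name of the additionally swept variable):

        ref_amp = calcVal("nominal_amplitude" "myTest" ?run "nominal_run" ?matchParams list("yourVariableName"))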

