
Delay Degradation vs Glitch Peak Criteria for Constraint Measurement in Cadence Liberate

anurans over 5 years ago

Hi,

This question concerns the constraint measurement criteria used by the Liberate inside view. I am trying to characterize a specific D flip-flop for low-voltage operation (0.6 V) using Cadence Liberate (V16).

When the define_arc commands are not explicitly specified for the circuit (although the inputs/outputs are correct in define_cell), the inside view appears to probe an internal node (i.e. the master latch output) for constraint measurements instead of the Q output of the flip-flop. To force the tool to probe the Q output, I added the following code for the constraint arcs:

# constraint arcs from CP => D
define_arc \
-type hold \
-vector {RRx} \
-related_pin CP \
-pin D \
-probe Q \
DFFXXX

define_arc \
-type hold \
-vector {RFx} \
-related_pin CP \
-pin D \
-probe Q \
DFFXXX

define_arc \
-type setup \
-vector {RRx} \
-related_pin CP \
-pin D \
-probe Q \
DFFXXX

define_arc \
-type setup \
-vector {RFx} \
-related_pin CP \
-pin D \
-probe Q \
DFFXXX

With -probe Q, Liberate identifies Q as the output, but it uses the glitch-peak criterion instead of the delay degradation method. What could be the exact reason for this unintended behavior? In my external (Spectre) SPICE simulation, the flip-flop works well and shows no delay degradation issues at the output when the input timing is swept.
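For reference, below is a minimal sketch of how I would expect the criterion to be pinned down globally in the command file, alongside the arc-level -probe definitions above. The variable name constraint_delay_degrade and the 0.1 (10%) value are assumptions based on my reading of the manual's constraint section, not verified V16 syntax:

# Sketch only: request the delay-degradation criterion globally.
# The variable name and default below are assumptions; confirm the exact
# knob and its default value in the Liberate manual for your release.
set_var constraint_delay_degrade 0.1
# Glitch-peak based measurement is expected to stay disabled by default;
# the define_arc ... -probe Q lines above are still needed so the
# measurement is taken at the flip-flop output rather than an internal node.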

Thanks

Anuradha

  • Guangjun Cao over 5 years ago

    If you let Liberate decide on the criteria and what to probe, the algorithm is not public; even I do not have access to it. The final decision may also differ between tool versions, which is why I don't understand your refusal to move to the latest or a newer release.

    A 10% degradation in delay just means a larger delay; your cell may still work normally with the 10% increase. But what happens if all cells have 10% extra delay in a high-speed chip with millions of cells? If that is still fine, then you can certainly increase the threshold.
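    If relaxing the threshold is acceptable, the change might look like the sketch below; the variable name constraint_delay_degrade is an assumption (the same one assumed in the sketch earlier in this thread), so confirm the exact setting in the manual for your release:

    # Sketch only: relax the assumed 10% delay-degradation threshold to 20%.
    # Variable name and default are assumptions; check the manual.
    set_var constraint_delay_degrade 0.2
    # Trade-off: setup/hold values become less pessimistic, but every cell
    # characterized this way may contribute extra delay at the chip level.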

    Guangjun

  • Guangjun Cao over 5 years ago in reply to Guangjun Cao

    Sorry, I forgot to mention:

    What you have observed regarding the change of criteria is NOT general. It depends on the circuit and its behavior, as well as on the tool and version.

    Guangjun

  • Guangjun Cao over 5 years ago in reply to Guangjun Cao

    And a change of criteria is NOT unintended behavior.

  • anurans over 5 years ago in reply to Guangjun Cao
    Guangjun Cao said:
    I don't understand why you refuse to move to the latest or newer release

    Well, I do not have access to the latest version right now, but the question is not related to the tool version.

    The delay degradation method is more pessimistic for constraint measurements than the glitch-peak method (hence delay degradation is preferred for ultra-low-voltage libraries). As I mentioned in my question, I explicitly use the arc definitions (with -probe Q), so I am not letting Liberate decide. By default the glitch_peak method is disabled and the delay_degradation method is enabled, so I expect the tool to strictly use delay_degradation instead of glitch_peak (as per the V16 and the latest V19.2 manuals).

    But as you said, if this depends on the tool version too, I will try to run the experiment in the latest version.

    Thanks

  • Guangjun Cao over 5 years ago in reply to anurans

    Why do you not have access to the latest release? V16 is way too old.

    Whichever method the tool uses, the actual threshold decides the final constraint value. If you want to match constraint values across different approaches, you will have to change the probe/threshold and maybe other constraint-related parameters. If you want to use a particular probe (which can change the criteria and hence may change the threshold) or merit, you may do so on a per-arc basis; please check the manual on define_arc.

    I would also like to remind you that most of the default settings are based on requests from major vendors and the majority of our customers. However, depending on your needs, e.g. if you believe the constraint from one method is too conservative, you may change the default or provide explicit information/settings to the tool.

    guangjun

