
Constraint Layering - Fine Tuning Your Environment - Part 2

12 Dec 2008 • 4 minute read

In my last post, I talked briefly about constraint layering and gave an extremely simple example of how users can layer constraints on top of an existing base environment to change how it behaves, all without touching the base code.

Of course, real verification tasks are more complicated than that simple example.  So in this edition, I want to go over some additional features that help address that complexity and further support constraint layering.

Soft Constraints
What happens when you want your "foundation" to have some "default shape"?  This is one role of soft constraints.  Using soft constraints, the base code writer can provide default limits that keep generation from going completely wild.  Later on, a user of that base code can further constrain the environment to fit a more specific situation.  Imagine the following simple example:


<'
//file: base_packet_def.e
struct packet_s {
    packet_addr : uint(bits:16);
};
'>


Many of you know that randomization in e is what is referred to as "infinity minus".  This means that a user does not have to specify which fields are randomized; rather, every field is automatically randomized to its fullest extent by default.  Usually this scheme requires the user to specify limits for that randomization to make the values realistic.

In the context of the example above, this means that packet_addr can take on any 16-bit value.  What happens if this large range does not make sense for the design being tested?  For example, suppose the address is only valid between 0x1000 and 0x1FFF.
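
To make that concrete, here is a small sketch of my own (the file name and the sys extension are illustrative, not part of the base environment) that generates a handful of completely unconstrained packets and prints their addresses.  The values can land anywhere in 0x0000..0xFFFF:

<'
//file: infinity_minus_demo.e -- illustrative only
import base_packet_def;

extend sys {
    packets : list of packet_s;
    keep packets.size() == 5;

    run() is also {
        // with no constraints, each address is drawn from the full
        // 16-bit range
        for each (p) in packets do {
            out("packet_addr = ", hex(p.packet_addr));
        };
    };
};
'>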

One way to implement this is to have the base code writer set default limits that keep the address in the specified range.  Here is some example code illustrating the concept.


<'
//file: base_packet_def.e
struct packet_s {
    packet_addr : uint(bits:16);
    keep soft packet_addr in [0x1000..0x1FFF];
};
'>


Now, when used without further constraints, the packet address stays in the valid range specified earlier.

You might be asking yourself, "Why use a soft constraint and not a hard one?"  The answer comes back to the concept of layering.  We said that the address is only legal in a certain range.  However, as verification engineers, we may want to test illegal cases, for example.  If the base code used hard constraints, a test writer would have to change the base code to allow setting the address to a value outside that range.
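
To see the problem concretely, consider a hypothetical variant of the base struct (the struct name here is mine) written with a hard constraint instead.  Layering the test constraint on top now produces a generation contradiction rather than the illegal packet we wanted:

<'
//file: hard_constraint_demo.e -- illustrative only
struct hard_packet_s {
    packet_addr : uint(bits:16);
    keep packet_addr in [0x1000..0x1FFF];  // hard constraint
};

extend hard_packet_s {
    // contradicts the hard range above, so generation stops with a
    // contradiction error instead of producing the illegal address
    keep packet_addr == 0x3500;
};
'>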

Furthermore, all of the "legal" tests would then have to include a constraint to steer away from the newly allowed illegal values.  Using soft constraints and constraint layering, the following example provides a much more elegant solution.


<'
//file: base_packet_def.e
struct packet_s {
    packet_addr : uint(bits:16);
    keep soft packet_addr in [0x1000..0x1FFF];
};
'>

<'
//file: env1_test.e
extend packet_s {
    keep packet_addr == 0x3500;
};
'>


In this example, the test writer was able to force an illegal value without changing the base code.  Also note that the rest of the environment still adheres to the limits in the soft constraint and generates legal addresses without any further constraints.
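
As a quick sketch of that last point (this companion file is hypothetical), a test that adds no address constraint of its own still gets legal addresses from the soft default:

<'
//file: env1_legal_test.e -- illustrative only
import base_packet_def;

extend sys {
    legal_packet : packet_s;

    run() is also {
        // stays within 0x1000..0x1FFF thanks to the soft constraint
        out("legal packet_addr = ", hex(legal_packet.packet_addr));
    };
};
'>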


reset_soft()
Many of you have probably played around with the soft constraint mechanism described above.  If so, you have undoubtedly faced another real-world situation that applies to our discussion: what happens when those default limits are no longer valid or have changed?  Here again there is an opportunity to leave the base code alone and layer on some additional code to set a new default range.  For this example, assume that the address range has changed to 0x3000 through 0x3FFF.  We will keep the existing base code file found above, but this time add a configuration file to change the defaults and adjust the illegal-value test.


<'
//file: base_packet_def.e
struct packet_s {
    packet_addr : uint(bits:16);
    keep soft packet_addr in [0x1000..0x1FFF];
};
'>

<'
//file: env2_packet_config.e
extend packet_s {
    keep all of {
        packet_addr.reset_soft();
        soft packet_addr in [0x3000..0x3FFF];   
    };
};
'>

<'
//file: env2_test.e
extend packet_s {
    keep packet_addr == 0x4500;
};
'>


In this example, the environment config writer (first layer) was able to "remove" the soft constraint from the base environment and apply a new one.  Then the test writer (second layer) was able to apply the specific constraint needed to produce an illegal test.
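
One way to picture the layering is through the load order.  A hypothetical top file might pull in the layers like this, with each later file free to reset or override the soft defaults of the files before it:

<'
//file: env2_top.e -- illustrative only
import base_packet_def;     // base layer: soft default 0x1000..0x1FFF
import env2_packet_config;  // first layer: resets the default to 0x3000..0x3FFF
import env2_test;           // second layer: forces the illegal value 0x4500
'>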


Hopefully this session has inspired you to think some more about areas where you can use constraint layering to facilitate verification code reuse.  As I mentioned in the last post, please feel free to get involved in this blog and send in any comments, questions, or topic ideas you may have.  This blog is for you (the user community), and I hope you will get involved to help shape it going forward.

Until next time!


Brett Lammers
Advanced Verification Core Competency Team Member
Cadence Design Systems, Inc
