The creation of behavioral models is only one part
of the process of using those models in a mixed-signal design verification flow. If the
model and design don't match, the effort is worthless. Even worse, it can damage the entire
design verification process.
"Why should I care about keeping
my behavioral models and designs in synch ?"
The benefit of using a bottom-up behavioral model to improve design verification simulation performance is wasted if the model does not match the behavior of the design it replaces. Worse, an out-of-sync model can flag verification issues that do not really exist, or hide those that do.
"How can I ensure my
behavioral model and design have equivalent behavior ?"
Until recently, this required a lot of manual effort and good discipline (pardon the pun). Keeping the two in step can take many hours of manual work, and only when a design or model change is actually recognized. If the differences go unrecognized, the cost later in the design flow can be several orders of magnitude higher.
"I don't have time to
do all that, what can you do to help ?"
Based on the feedback and requirements from many customers who are actively using behavioral models within their design verification flows, we developed amsDmv (Analog Mixed Signal Design and Model Verification).
amsDmv is fully integrated with Virtuoso, ADE L/XL, ViVa, and SimVision to provide a complete, self-contained model verification and debugging solution. The amsDmv flow is shown below.
The user-provided design, model, and test benches are simulated using the relevant tools and flows within amsDmv to create the required measured results and waveform signals. amsDmv then validates these results and waveforms against the criteria selected and set up by the user.
The waveform validation algorithms are especially powerful and useful because, for many design types, the user does not need to create specific measurements in the test benches. The algorithms highlight unacceptable absolute and relative amplitude and time differences. Once set up interactively, the amsDmv design vs. model validation can be exported to a UNIX batch script that can be run within a nightly regression.
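amsDmv's actual comparison algorithms are not described here, but the general idea of checking waveforms against combined absolute and relative amplitude tolerances, plus a timing tolerance, can be sketched as follows. The function names, tolerance values, and waveforms below are invented for illustration only:

```python
import numpy as np

def amplitude_mismatches(design, model, abs_tol=1e-3, rel_tol=0.05):
    """Indices where |design - model| exceeds abs_tol + rel_tol * |design|,
    i.e. samples outside both the absolute and relative amplitude budget."""
    allowed = abs_tol + rel_tol * np.abs(design)
    return np.flatnonzero(np.abs(design - model) > allowed)

def first_rising_crossing(t, v, threshold):
    """Linearly interpolated time of the first rising crossing of
    `threshold`, or None if the waveform never crosses it."""
    above = v >= threshold
    idx = np.flatnonzero(~above[:-1] & above[1:])
    if idx.size == 0:
        return None
    i = idx[0]
    frac = (threshold - v[i]) / (v[i + 1] - v[i])
    return t[i] + frac * (t[i + 1] - t[i])

# Illustrative data: a model edge that lags the design edge by 2 ns.
t = np.linspace(0.0, 100e-9, 1001)
design = np.tanh((t - 40e-9) / 5e-9)
model = np.tanh((t - 42e-9) / 5e-9)

bad = amplitude_mismatches(design, model)   # failing sample indices
skew = (first_rising_crossing(t, model, 0.0)
        - first_rising_crossing(t, design, 0.0))  # ~2 ns time difference
```

Combining an absolute floor with a relative term means small signals are not failed on microvolt noise while large signals are still held to a percentage budget, which is the usual rationale for this style of check.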
If the differences between the design and model exceed the user-provided tolerances, the batch script fails and reports the problem areas, which can later be debugged interactively within amsDmv.
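The exported batch script itself comes from amsDmv, so the sketch below is purely illustrative: a minimal gate, with invented error and tolerance numbers, showing the exit-status contract a nightly regression scheduler relies on (a non-zero exit marks the run as failed):

```python
def regression_gate(worst_error, tolerance):
    """Return a process exit status for a nightly regression runner:
    0 when the worst design-vs-model error is within tolerance,
    1 otherwise, so the scheduler records any mismatch as a failure."""
    if worst_error > tolerance:
        print(f"FAIL: worst error {worst_error:g} exceeds tolerance {tolerance:g}")
        return 1
    print("PASS: design and model match within tolerance")
    return 0

# Invented numbers: an 8% worst-case error against a 5% budget.
# A real wrapper would end with sys.exit(status).
status = regression_gate(0.08, 0.05)
```

Returning the status rather than calling `sys.exit` directly keeps the gate testable; the thin wrapper that the scheduler invokes is the only place that actually exits.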
The need for design vs. model validation within a complete mixed-signal design flow
Whether the model is created manually or via Schematic Model Generation (SMG, to be covered in a later blog), there are many checkpoints within the design flow where it is important to check that the model and design are equivalent. As the design progresses, the design or model may need tweaking, or the model may need to be recalibrated against measurements made by simulating the post-layout extracted view with parasitics.
Once the design has been through layout and the parasitics extracted, the post-layout design can be simulated and the model recalibrated to create a silicon-calibrated behavioral model that represents the behavior of the finished design. At each of these points in the design flow, the different abstractions and phases of design and model creation can be cross-validated to ensure consistency.
The onerous, time-intensive, and error-prone task of keeping behavioral models and designs in sync can be significantly automated by using amsDmv. This enables users not only to improve design verification simulation performance, but also to trust that the results they are seeing accurately match the real design.
(CIC / CSV Architect)