The post A Perspective on Perspec earlier this week introduced the idea behind Perspec System Verifier: given a fairly abstract model of the system at a very high level, it can automatically generate scenarios to verify high-level functionality. The example used was checking that cache coherence is maintained across the power-down and power-up of one of the participating cores. Obviously, a blog post like this is not a training course; I am just trying to give a flavor of what is required to do the modeling, and of what can be done with the model afterwards.
Here is the block diagram of a simple system, some sort of video camera. The system functionality is captured by the dozen operations on the right: the camera can capture video, the modem can transmit data, the graphics processor can display, the audio subsystem can play and record, and so on. Using UML, each of those functions needs to be described in a little more detail, but this is nothing like building a behavioral model for a virtual platform, something I spent several years of my career working on.
Here, for example, is the USB read and write:
What makes Perspec System Verifier so powerful is that this simple semantics of actions, inputs and outputs, and resources can be analyzed by tools, enabling automated use-case creation. For example, say you want to create a use case that decodes video from the DDR and shows it on the display. It turns out there are not many ways to do this in our simple system. In fact, Perspec System Verifier can automatically create a UML-like diagram to do just this:
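To make the idea concrete, here is a minimal sketch (in Python, not Perspec's actual modeling language) of why declaring each action's inputs and outputs is enough for a tool to assemble a use case automatically. The action names and data states below are invented for this illustration; a simple search finds a chain of actions that turns "video in DDR" into "shown on the display".

```python
from collections import deque

# Toy action library: each action consumes one data state and
# produces another. (Names are made up for this sketch.)
ACTIONS = {
    "camera_capture": (None,            "video_in_ddr"),
    "video_decode":   ("video_in_ddr",  "frames_in_ddr"),
    "gfx_display":    ("frames_in_ddr", "on_display"),
    "modem_transmit": ("video_in_ddr",  "sent_over_air"),
}

def find_use_case(start_state, goal_state):
    """Breadth-first search for a chain of actions from start to goal."""
    queue = deque([(start_state, [])])
    seen = {start_state}
    while queue:
        state, path = queue.popleft()
        if state == goal_state:
            return path
        for name, (inp, out) in ACTIONS.items():
            if inp == state and out not in seen:
                seen.add(out)
                queue.append((out, path + [name]))
    return None  # no way to reach the goal with these actions

print(find_use_case("video_in_ddr", "on_display"))
# → ['video_decode', 'gfx_display']
```

The point is that the test writer never spells out this sequence; it falls out of the declared inputs and outputs, which is exactly why there are "not many ways to do this" in a small system.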
Further, Perspec System Verifier can automatically generate the code, the coverage model, and more to actually run these tests in C and SystemVerilog. Building parallel use cases becomes simple. For example, if you have a test of the audio subsystem and a test of the video subsystem, you might be concerned that running both together might not work cleanly: too much demand on the system buses, a memory architecture unable to keep up, or something similar. This sort of concurrency is very hard to schedule using traditional verification.
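A tiny sketch (with invented resource names) of why declaring resources matters for scheduling parallel use cases: two tests can overlap only if the resources they claim are disjoint, otherwise a tool must serialize them.

```python
# Each test declares the resources it claims while running.
# (Resource names here are made up for illustration.)
AUDIO_TEST = {"resources": {"audio_dma", "ahb_bus"}}
VIDEO_TEST = {"resources": {"video_dma", "axi_bus"}}
MODEM_TEST = {"resources": {"modem_dma", "axi_bus"}}  # also needs the AXI bus

def can_run_in_parallel(a, b):
    """Tests may overlap only when their resource sets do not intersect."""
    return not (a["resources"] & b["resources"])

print(can_run_in_parallel(AUDIO_TEST, VIDEO_TEST))  # True: disjoint resources
print(can_run_in_parallel(VIDEO_TEST, MODEM_TEST))  # False: both claim axi_bus
```

Doing this bookkeeping by hand across dozens of tests and shared DMA channels is exactly the scheduling problem that is so error-prone in traditional directed testing.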
To run this test, Perspec System Verifier will:
Just as constrained random with UVM works well because the determinism of a small number of test cases can lend a false sense of security, Perspec System Verifier adds a lot of randomization, such as running tests on all the cores, altering the timing, and so on. It then generates all the code required to run on the processors so that the scenario can be run on a virtual platform, an emulator, the real silicon, or a SystemVerilog simulator such as the Incisive platform.
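The kind of randomization described above can be sketched as follows. This is a toy illustration, not Perspec's generator: the core names, delay range, and scheduling fields are all invented, and the seed is kept explicit so that a failing run could be reproduced.

```python
import random

CORES = ["core0", "core1", "core2", "core3"]  # assumed 4-core system

def randomize_schedule(actions, seed):
    """Assign each action a random core and a jittered start time."""
    rng = random.Random(seed)  # seeded for reproducible failures
    return [
        {"action": a,
         "core": rng.choice(CORES),
         "start_delay_us": rng.randint(0, 500)}
        for a in actions
    ]

for step in randomize_schedule(["video_decode", "gfx_display"], seed=42):
    print(step)
```

Rerunning with a different seed exercises different core assignments and timing interleavings of the same scenario, which is how a single abstract use case turns into many distinct stress tests.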
Putting it all together, here is the usage flow:
This has been a brief attempt to show how powerful this approach can be. The model is not hugely complex to create, and once it exists, Perspec System Verifier can use formal approaches to reason about it, eliminating a lot of work that would otherwise be done by hand. It then automates creating all the tests, which can be enormously complicated for a multi-core CPU with DMA peripherals and the like. For many types of designs, there are libraries of elements already modeled (or at least partially modeled), so it is not necessary to start from scratch. In fact, partly as a test example and partly because it is so important, there is a full library of flexible content for ARM®-related challenges. Much of the modeling is reduced to filling in Excel spreadsheets with the specific configuration (bus widths, core count, etc.).
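To give a feel for that "fill in a spreadsheet" style of configuration, here is a hedged sketch using CSV text in place of an Excel sheet; the parameter names are invented, not Perspec's actual schema.

```python
import csv
import io

# Stand-in for a configuration spreadsheet: one parameter per row.
# (Parameter names are hypothetical examples.)
CONFIG = """parameter,value
core_count,4
axi_bus_width,128
ddr_size_mb,2048
"""

def load_config(text):
    """Parse the parameter table into a dict the model library can consume."""
    return {row["parameter"]: int(row["value"])
            for row in csv.DictReader(io.StringIO(text))}

cfg = load_config(CONFIG)
print(cfg["core_count"])  # → 4
```

The appeal is that the library code stays fixed while each project supplies only a table of numbers like this, which is why the modeling effort is so much smaller than writing a behavioral model from scratch.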
ST presented on their use of Perspec System Verifier at DVCon last year. Their bottom line: "the learning curve is days, nothing like the complexity of UVM." They thought the productivity improvement was 20X.