Virtual platforms are becoming increasingly important for pre-silicon hardware/software verification and integration, but they pose some ongoing technology and business challenges for the supplier/user ecosystem, according to panelists at the recent Design Automation Conference. Panelists discussed issues such as model interoperability, debugging, software verification, and support for multiple abstraction levels at the Cadence Ecosystem booth Tuesday, July 28.
Moderator Ran Avinun, marketing group director for system design and verification at Cadence, posed a number of challenging questions for panelists from STMicroelectronics, ARM, Virtutech, and Cadence. Although I’ve written about virtual platforms during the past couple of years, I gained some new perspectives from their answers. Following are some interesting points that provide an update about virtual platforms and show what still needs to be done.
For markets with short time-to-market windows, virtual platforms have become indispensable.
STMicroelectronics has to go from concept to silicon in less than 12 months on consumer products, said Laurent Ducousso, IP verification and software development platform manager. “We need to develop hardware and software concurrently, and the only way to do this in such a short time scale is to have a model of the design. The only way to anticipate software development is to have a virtual platform.”
Virtual platforms help with both hardware and software development and validation.
Recent press about virtual platforms has focused on their role for early software development. However, several panelists pointed to the hardware/software interface as the place where the toughest challenges lie. “It’s all about the boundary,” noted Michel Genard, marketing vice president at Virtutech. “It’s about how you bring hardware and software together. You can’t just look at it from a hardware perspective or a software perspective.”
Model interoperability is crucial – and OSCI TLM 2.0 is just a start.
The Open SystemC Initiative (OSCI) transaction-level modeling (TLM) 2.0 standard is “a good start, but is still lacking lots of things,” said Jason Andrews, architect at Cadence. “Yes, models interoperate and can run together, but can you interoperate all the debugging tools?” Joe Krech, product manager for Fast Models at ARM, pointed to the OSCI Configuration, Control and Inspection (CCI) working group as a “next level” that can help with run-time debug control.
Genard said Virtutech is “very pleased” with OSCI TLM 2.0 but wants to see more happen with respect to model interfaces and “best practices” for designing models. Ducousso said that STMicroelectronics is pushing for TLM 2.0 adoption, but is also actively supporting the IP-XACT format (developed by the SPIRIT Consortium) as a way of packaging and reusing models.
Virtual platform panel participants included Ran Avinun (podium) and Laurent Ducousso, Michel Genard, Jason Andrews, and Joe Krech (left to right).
For real success with virtual platforms, embedded software verification needs a better methodology.
Yes, virtual platforms allow early software development and debugging, but current software verification methods are ad-hoc at best. “In embedded software verification, we boot the system, look at the screen and see if the program is running,” Andrews said. “Verification has to evolve to the point that automation is much better.”
Genard said that as software developers use virtual platforms, they’ll be exposed to verification methodologies used in the hardware side, and may then adopt more formalized methodologies. He mentioned the recent agreement between Cadence and Virtutech, which links Cadence Incisive Software Extensions with the Virtutech Simics environment, as a step in this direction.
Virtual platform environments need to support, or connect to, different abstraction levels.
The fastest way to run virtual platforms is to use untimed TLMs, but there comes a time when you need cycle accuracy and timing information. Further, in many cases, only RTL code is available for IP models. Ducousso said that STMicroelectronics addresses these problems by running emulation in a “hybrid” environment with virtual platforms.
Andrews talked about the need to “mix and match” models running at different levels of abstraction, including RTL. “If we architect transactional interfaces right, the user will be able to mix and match abstraction levels as parts of the design become available,” he said.
The virtual platform business model is still evolving.
Who takes responsibility for the full virtual platform, including models and tools? “I don’t think we’ll see a single point of ownership – the value chain is going to be complex,” Genard said. “What’s unknown is the business model.” One model that won’t work, he suggested, is one in which virtual platform providers make their money by selling models. “You want to be in a position where the value you provide is coming from the deployment of the virtual platform and the tools,” he said.
There seemed to be general agreement on one point – IP developers should provide IP models. Krech talked about the advantages ARM provides by developing its own IP models. “The best guy to develop the model is the IP provider himself,” agreed Ducousso.
So, in summary, what still needs to be done? A short list: expand TLM standards efforts to include debugging, develop transactional interfaces for modeling at mixed abstraction levels, develop a more formalized software validation methodology, provide IP at the right levels of abstraction, and work out a business model that has long-term viability. A deeper challenge was identified by Joe Krech. “We still have hardware-oriented people and software-oriented people,” he said. “I think there are ways of educating and closing that gap.”