Embedded software development has become the main pain point in many of today's complex systems-on-chip (SoCs). Is it time to recognize software as the "center of gravity" in SoC development, and to work toward continuous co-development of hardware and software throughout the design and verification flow?
Jim Ready thinks so, and you should listen, because this is the man who developed the first commercially viable real-time operating system (RTOS) and who later pioneered embedded Linux commercialization. Today, Ready is Chief Technology Advisor for Software and Embedded Systems at Cadence. He recently wrote an article for RTC Magazine that details his thinking about a "software-driven SoC development" flow.
In early 2013, I blogged about "software-driven verification," which is one aspect of software-driven development. With software-driven verification, engineers use software running on an embedded processor model to build testbenches for hardware verification. It's a reality today but is generally an ad-hoc methodology with minimal tool support. But software-driven verification is just one part of a broader vision, and I recently talked with Ready to learn what that vision is.
The Tyranny of Software
According to analyst Gary Smith, the industry finally has a "true" electronic system level (ESL) design and verification flow. The Cadence System Development Suite (see figure below) provides four tightly linked hardware/software development platforms including virtual prototyping (Virtual System Platform), simulation (Incisive), acceleration/emulation (Palladium XP), and FPGA prototyping (Rapid Prototyping Platform).
With all these efforts to facilitate early software development, what could possibly go wrong?
Massive amounts of software, that's what. "The driving force is the tyranny of software and what it takes to execute it," Ready said. He noted that it may take a billion instructions to boot Android or Linux, and then you have to keep running the OS in order to get to the software you actually want to debug. The amount of software that needs development and debugging has become so massive that it threatens to roll back some of the progress that's recently been made.
Moreover, Ready noted, there is "a gap between theory and practice" at many semiconductor and systems companies. People have been talking about hardware/software codesign for more than 20 years, but there are many reasons why it's often not done. For example, there are structural issues in the industry, one of which is the independent development of hardware and software. Android, developed without knowledge of any particular SoC, is a case in point.
With Android, Ready noted, "your hands are tied - it has whatever interfaces it has, and they may or may not match the hardware you're developing." While SoCs today are composed mostly of reused semiconductor IP, most have some custom-designed features as well. If hardware designers are not considering the impact on software, they might design a hardware subsystem that Android does not recognize.
Hardware engineering change orders (ECOs), as inevitable as death and taxes, are another problem. "A small change in hardware can make a huge difference," Ready said. "Changing one register can end up requiring a costly rewrite of a driver, and potentially parts of the Linux kernel." With a software-driven development approach, the impact of a hardware change on software would be a key consideration.
What's the Ideal Situation?
A software-driven SoC development approach is conceptually fairly simple. Engineers and architects start with a system design. They define some functionality, they select an OS, and they partition hardware and software. "At a minimum, the software guys are in the room," Ready said. "Best case is where there's a virtual interface that doesn't change. The center of gravity is a massive amount of software, and you want to keep it as constant as possible. That's what layering and virtualization and drivers are all about."
As a software developer, Ready said, "I don't want to know which hardware-compliant device driver I have. I want to forget all that, and let the virtualization do that." SoCs change quickly, Ready noted, and what's required is "the ruthless abstraction of all possible devices." This means hardware can change and software developers might never notice the difference.
One hallmark of a software-driven flow is "meaningful" software execution speeds. Further, software execution must be supported during all phases of SoC hardware development. The System Development Suite is a huge step towards this goal, but there is more to be done. For example, the Palladium XP emulation system offers a unique combination of speed, RTL accuracy, and signal visibility, but it can still take many hours to boot Android or Linux.
Hybrid Approach Accelerates Software Execution
In September, Cadence introduced the Palladium XP II emulation system. It allows a new "hybrid" use model that combines virtual platform models with emulation, and it's a good example of software-driven SoC development. With the hybrid model (see diagram below), fast processor models run on a host workstation, while RTL runs inside the emulator. Customer experience suggests that hybrid mode users can boot an OS up to 60X faster, and run hardware/software verification up to 10X faster, compared to conventional emulation use models.
At the Design Automation Conference in June 2013, a speaker from AMD showed how his company is using the hybrid mode (called "virtual platform co-emulation") with Palladium. A previous blog post reports on this experience.
Thanks to the hybrid model, Ready said, large amounts of software can run on the Palladium. You can get through the OS bring-up process quickly and then "turn on the magnifying glass" to look at RTL code with full emulator visibility. At this point you're limited to emulator speeds (about 1 MHz), but you are able to get to what you want to see a lot more quickly.
More needs to be done to bring software to center stage in the SoC development process. Virtual prototyping is enabling early software development at many companies, but there's always a need for more models. FPGA prototyping runs much faster than emulation, but it requires mature RTL, and it takes time to partition that RTL code into FPGAs. What's needed, Ready said, is "the integration of all the execution environments and the optimization of them."
As for Cadence, Ready noted, "in each case we have a strategy for hitting the limitations that are exacerbated by large amounts of software on all of our execution environments. And that's pretty neat."
For a deeper perspective on software-driven SoC development, see Jim Ready's RTC Magazine article.
Related Cadence Blog Posts
Q&A: Jim Ready Discusses EDA Connection to Embedded Software Development
Software-Driven Verification - a Hot Topic for 2013?
Gary Smith Webinar: "The True ESL Flow is Now Real"
Palladium XP II - Two New Use Models for Hardware Verification
Designer View: New Emulation Use Models Employ Virtual Targets