Changing the Status Quo in SoC to System Hand-off

13 Jul 2010 • 4 minute read

As part of EDA360, Cadence is learning how to play a more significant role in the SoC-to-System handoff. To date, Cadence has served the SoC market by enabling companies to design and verify faster, bigger, and better SoC devices that get used by their customers in systems. For a long time we have been hearing that a system is more than just a chip, and that for a chip to be successful in the market, a large amount of embedded software is required.

Lately, we have seen an uptick in the number of SoC companies interested in using Virtual Platforms to develop software before silicon or even RTL is available. Generally, they want to get a jump start on software development, improve software quality, and tune the architecture based on hw/sw interaction. It seems clear that a significant portion of the deliverable created by an SoC company is software.

At DAC this year I also sensed a change in the way engineers are talking about Virtual Platforms. Much of the past struggle, rooted in Missing Model Syndrome and the limited resources companies were willing to commit to Virtual Platform creation, has slowly been melting away, and users are now more interested in taking control of the situation and controlling their own destiny.

Another positive trend is engineers touting the benefits of using Virtual Platforms, such as the recent article co-authored by Everett Lumpkin. When I see an article co-authored by a user and a vendor, I'm suspicious that the vendor did all the writing and the user just put his name on it to add some credibility, but not this time. Check out Everett's LinkedIn profile.

Even with all of these positive developments, it's my opinion that the majority of the world is not using any kind of simulation for embedded software development. As an example, the talk at DAC 2010 by Iqbal Arshad was one of the most insightful talks I can remember on the current state of embedded software and system design.

My first thoughts after watching the talk were related to the deliverables between the SoC company and Motorola during the Droid development. I could guess that the SoC provider may have provided many things to enable the system design and secure the design win. Certainly, a running operating system and device drivers were a required part of the delivery, but what about simulation?

Some possible deliverables might be:

  • a complete (but fixed) simulation of the SoC running the OS, drivers, and sample apps
  • a set of models that could be used by Motorola to create multiple simulation environments
  • nothing related to simulation, just send a hardware board running OS, drivers, and sample apps
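
To make the second option on this list more concrete, here is a minimal sketch, entirely my own illustration and not anything actually delivered for the Droid, of the kind of reusable component model an SoC vendor could hand off: a memory-mapped peripheral written in SystemC with a TLM-2.0 target socket. The module name, register map, and latency are all assumptions.

```cpp
// Hypothetical timer peripheral model; names, register offsets, and timing are
// illustrative assumptions, not details of any real SoC deliverable.
#include <cstdint>
#include <cstring>
#include <systemc>
#include <tlm>
#include <tlm_utils/simple_target_socket.h>

struct hypothetical_timer : sc_core::sc_module {
    tlm_utils::simple_target_socket<hypothetical_timer> socket;

    SC_CTOR(hypothetical_timer) : socket("socket") {
        // All bus traffic from the integrator's platform arrives through b_transport.
        socket.register_b_transport(this, &hypothetical_timer::b_transport);
    }

    void b_transport(tlm::tlm_generic_payload& trans, sc_core::sc_time& delay) {
        const sc_dt::uint64 addr = trans.get_address();
        unsigned char* data = trans.get_data_ptr();

        if (trans.is_write() && addr == 0x0) {            // CTRL register (assumed)
            enabled_ = (data[0] != 0);
        } else if (trans.is_read() && addr == 0x4) {      // COUNT register (assumed)
            if (enabled_) ++count_;                       // crude stand-in for real timing
            std::memcpy(data, &count_, sizeof(count_));   // assumes 4-byte reads
        }

        trans.set_response_status(tlm::TLM_OK_RESPONSE);
        delay += sc_core::sc_time(10, sc_core::SC_NS);    // assumed register access latency
    }

private:
    uint32_t count_ = 0;
    bool enabled_ = false;
};
```

Because a model like this talks to the rest of the platform only through its TLM socket, the receiving team can bind the same model into a fast software bring-up platform, an architecture exploration environment, or a mixed simulation with RTL, which is what makes the second option so much more flexible than a single fixed simulation.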

In order for Virtual Platforms to become universal, they must overcome the current status quo. I have two daughters who are homeschool debaters. Every year they debate a resolution that involves a proposed change to the status quo. For the 2010-11 year the resolution will be:

Resolved: That the United States Federal Government should significantly reform its policy toward Russia.

You can already keep up with the latest Russia news at Blue Book Report. In Team Policy debate the affirmative team must present a plan and convince the judge that the plan is worthwhile and compelling enough to change the status quo.

Today, the status quo for the SoC-to-System handoff is a hardware board running an operating system with drivers and sample apps. In the case of the Droid example, it could have been Zoom.

In a typical reference design everything is open source, from the hardware details to the many different software projects that run on the hardware. By default, the hardware reference design serves as the starting point for system design. By adding more hardware peripherals, adding more drivers, and taking care of all of the important details of the super-thin package, antenna design, power management, sensors, manufacturing, and more, the reference design can be transformed into a commercial product like the Droid.

From the DAC talk I gathered that this transformation involved a lot of blood, sweat, and tears, and my hat goes off to the entire team that made the Droid a reality; it's a very impressive accomplishment. I could also infer that the last mile was difficult. Issues with performance and power are hard to track down, and I'm sure the Law of Leaky Abstractions came into play. The various abstractions, from the hardware all the way up to the apps, are great until something goes wrong; then Motorola engineers must unravel the stack to find the leak.

A software stack that mostly works but hides inefficiencies related to performance or power in a device driver can be very difficult to diagnose and fix. Iqbal explained that all of the device drivers, which probably looked fine on the reference platform, had to be rewritten.
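
To illustrate the kind of problem Iqbal was describing, here is a hedged sketch, my own example rather than anything from the Droid or the presentation, of a driver read routine that is functionally correct on a reference board yet hides a power and performance cost. The device model and names are hypothetical stand-ins.

```cpp
#include <atomic>
#include <condition_variable>
#include <cstdint>
#include <mutex>

// Hypothetical device model standing in for memory-mapped hardware; the register
// layout and the "irq" condition variable are illustrative, not real driver APIs.
struct Device {
    std::atomic<uint32_t> status{0};   // bit 0 = DATA_READY (assumed layout)
    uint32_t data = 0;
    std::mutex m;
    std::condition_variable irq;       // stands in for the hardware interrupt line
};

constexpr uint32_t DATA_READY = 1u << 0;

// Reference-board style: functionally correct, so it looks fine during bring-up,
// but the busy-wait keeps the CPU out of its low-power states and steals cycles
// from everything else. The cost shows up later as battery drain and lag.
uint32_t read_sample_polling(Device& dev) {
    while ((dev.status.load() & DATA_READY) == 0) {
        // spin
    }
    return dev.data;
}

// Product style: sleep until the "interrupt" fires so the CPU can idle in the
// meantime. In a real driver this would be a wait queue, semaphore, or RTOS event
// rather than a condition variable, but the structural difference is the same.
uint32_t read_sample_irq_driven(Device& dev) {
    std::unique_lock<std::mutex> lock(dev.m);
    dev.irq.wait(lock, [&dev] { return (dev.status.load() & DATA_READY) != 0; });
    return dev.data;
}
```

Nothing in the polling version is functionally wrong, so it sails through bring-up on a wall-powered reference board; the cost only surfaces once the whole stack is layered on top, which is exactly why such drivers end up being rewritten.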

Some bullets directly from the presentation (slide 31):

  • Debugging Nightmare
  • Everybody needs proto units
  • Upto 6 months in schedule
  • Costs in excess of several million $

The presentation is a wonderful contribution to the industry, providing details on what it takes for any Virtual Platform approach to convince system companies to change the status quo.

Jason Andrews

 

 
