jasona
Concurrent Engineering
System Design and Verification
ISX

Is Concurrent Engineering actually getting worse?

14 Aug 2008 • 3 minute read

Today I'm taking a few minutes to jot down some recent observations about the state of concurrent engineering as it relates to hardware and software verification.

Everybody knows there is an opportunity to improve time-to-market if projects can get more of the hardware and software work done in parallel. Practically, this mostly means getting the software done sooner, since software is usually the last piece in the integration process.

A couple of months ago I wrote an article about ISX usage by one of our best customers. I'm probably not letting out any big secret here, but lots of articles about EDA tool usage are either partly or mostly written by the EDA vendors themselves (sometimes even by R&D people like me) based on information gathered from users.

One of the tidbits this ISX user communicated was that the software for his projects is getting more and more complex and becoming available later and later. The need for better and earlier integration of hardware and software has been recognized for a long time, yet growing complexity keeps pushing software availability in the wrong direction.

This makes it impossible to simply "get all the software and run it" on an emulator or an FPGA. This trend of partial software availability might be bad for the companies trying to deliver chips, but it may lead to more demand for ISX to create a verification environment that can run with only partial software available. The ability to build a verification environment that connects to both the hardware and the partial software has been one of the primary uses of ISX so far. In this application, ISX serves as the "missing software," providing stimulus and checking for the software that is available.
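To make the "missing software" idea concrete, here is a minimal sketch in plain C, not actual ISX code, of what such a stand-in layer might look like. The stub takes the place of an unavailable upper software layer, drives the one layer that is available through a hypothetical driver API (dma_copy is invented for illustration), and checks the results:

```c
/* Minimal sketch (hypothetical API, not real ISX code): a stub that stands
 * in for a missing upper software layer, driving the available driver with
 * randomized stimulus and checking the results. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Stand-in model of the available driver; a real project would link
 * against the actual driver running on the emulated hardware. */
static int dma_copy(const void *src, void *dst, size_t len)
{
    memcpy(dst, src, len);
    return 0;  /* 0 = success */
}

int main(void)
{
    unsigned char src[256], dst[256];
    srand(42);  /* reproducible pseudo-random stimulus */

    for (int i = 0; i < 1000; i++) {
        size_t len = 1 + (size_t)(rand() % (int)sizeof(src)); /* random length */
        for (size_t j = 0; j < len; j++)
            src[j] = (unsigned char)rand();                   /* random payload */
        memset(dst, 0, sizeof(dst));

        if (dma_copy(src, dst, len) != 0) {                   /* drive */
            fprintf(stderr, "FAIL: dma_copy error, len=%zu\n", len);
            return 1;
        }
        if (memcmp(src, dst, len) != 0) {                     /* check */
            fprintf(stderr, "FAIL: data mismatch, len=%zu\n", len);
            return 1;
        }
    }
    printf("PASS: 1000 randomized DMA copies checked\n");
    return 0;
}
```

The point is not the DMA example itself but the role the stub plays: it provides stimulus and checking in place of software that does not yet exist, so verification can proceed before the full software stack is available.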

In the last two weeks, I had meetings with software engineers from two leading companies. One is probably best classified as a chip company that develops a lot of software; the other is a system company that also designs the chips for its systems.

In both meetings, no hardware engineers were present. As we discussed pre-silicon software development and software verification, I received very similar responses from both.

During pre-silicon software development, the focus is actually NOT on any type of verification, but rather on showing positive results for targeted features that combine hardware and software. The approach is to get software working for most of the high-probability use cases and stop there. It seems to be a defensive approach: demonstrate that the hardware is good enough to implement the feature, so there are no showstopper issues with the hardware, and the software can always be tweaked later.

Both companies use a mix of Palladium emulators and abstract virtual system prototype models; one simple example is just booting the operating system on an emulator. Both told me that serious software testing starts after silicon is available, when they can run at full speed in the complete system environment.

What struck me was the lack of effort to find software bugs earlier, at a time when the entire software industry (including Cadence) is pushing for higher quality software sooner. Surely, the cost of finding and fixing problems in a post-silicon lab environment must be far higher than the cost of having the original developer find and fix them much earlier in the project.

Today I received an e-mail from Wind River about a seminar. The headline was an eye-catcher, even if the answer was obvious: "Have you inadvertently deployed products with software defects?" Of course every software product has defects, but dynamic testing is a great topic for embedded software. They also talk about "function coverage". In ISX we talk about "functional coverage" for software. The two sound similar but are not the same: function coverage records which functions have executed, while functional coverage records which features and scenarios have been exercised. Either way, it's good to see more emphasis on embedded software testing.
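Since the distinction matters for question 3 below, here is a hedged sketch in C of what constrained-random generation, automatic checking, and functional coverage could look like for software. All names (send_packet, the length constraint, the coverage bins) are invented for illustration; ISX itself expresses this differently. Note that function coverage would only record that send_packet() ran, while functional coverage records which scenarios it ran under:

```c
/* Sketch (hypothetical names): constrained-random stimulus, an automatic
 * check, and functional coverage bins for a software API under test. */
#include <stdio.h>
#include <stdlib.h>

enum mode { MODE_FAST, MODE_SAFE, MODE_COUNT };

struct packet { enum mode mode; unsigned len; };

/* Stand-in for the software under test; returns bytes accepted. */
static unsigned send_packet(const struct packet *p)
{
    return p->len;
}

int main(void)
{
    /* Functional coverage: mode crossed with small/medium/large length. */
    unsigned cov[MODE_COUNT][3] = { {0} };
    srand(1);

    for (int i = 0; i < 200; i++) {
        struct packet p;
        p.mode = (enum mode)(rand() % MODE_COUNT);   /* random mode         */
        p.len  = 1 + (unsigned)(rand() % 1500);      /* constraint: 1..1500 */

        unsigned got = send_packet(&p);
        if (got != p.len) {                          /* automatic check */
            fprintf(stderr, "FAIL: sent %u, accepted %u\n", p.len, got);
            return 1;
        }
        int bucket = p.len < 64 ? 0 : p.len < 512 ? 1 : 2;
        cov[p.mode][bucket]++;                       /* sample coverage */
    }

    /* Coverage report: did we actually exercise every scenario? */
    const char *bname[] = { "small", "medium", "large" };
    for (int m = 0; m < MODE_COUNT; m++)
        for (int b = 0; b < 3; b++)
            printf("mode=%d len=%-6s hits=%u%s\n", m, bname[b], cov[m][b],
                   cov[m][b] ? "" : "  <-- coverage hole");
    return 0;
}
```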

As always, inputs are welcome:

  1. Is concurrent engineering between hardware and software getting better or worse at your company?
  2. Is there a push to get higher software quality sooner?
  3. Do you think software verification based on constrained random generation, checking, and functional coverage would help create higher quality sooner?
  4. Is true software verification mostly a post-silicon effort?
