Paul McLellan

What Next for Modus DFT?

30 Jan 2019 • 3 minute read

I sometimes say that test is the red-headed stepchild of EDA, one that doesn't get the same glory as higher-profile parts of the EDA flow such as synthesis, place & route, or signoff.

Test History

Over the years, how we do test has been through a number of eras:

  • Prehistory: A test engineer who really understood testers would start from scratch and write a test program for the chip. They had a lot of their own terminology (such as a "mask") that confused designers, but the designers didn't really talk to the test people anyway.
  • Stone age: Analog chips remained in the pre-history stage, but digital chips were now simulated. The vectors used for that simulation could be used to automatically generate the test program.
  • Late stone age: The problem with just using the simulation vectors is that nobody had any idea how effective the vectors were at finding manufacturing defects. So fault grading was added to the mix: the simulation vectors would be run through a fault simulator, which would determine what percentage of faults (stuck-at-0 and stuck-at-1 on each gate output) were detected by the program (a small sketch after this list illustrates the idea).
  • Bronze age: Getting test coverage to a reasonable level was becoming very difficult, and required a lot of manpower at a point in the design cycle where any delay fed straight into the schedule. The solution was to automate the process with scan test. This required restrictions on the design (race-free and glitch-free logic, for example), but they were not too onerous.
  • Iron age: Scan test was automatic, but it created test programs that were very long (a lot of test vectors). The long running times of the test programs meant that a lot of testers were required, and the cost of testing a chip was starting to surpass the cost of manufacturing it. The solution was test compression: a small amount of additional hardware on the chip that would expand each vector applied to the pins into many vectors fed into the scan chains, and compress the many vectors coming out of the scan chains into a single vector to be read out of the pins. The additional hardware was known as the compressor-decompressor, or codec (a toy sketch of this expand/compact idea appears below, after the Modus paragraph).
  • Modern times: Compression factors got up to about 50:1, but it is hard to go much higher in the general case, although extremely high ratios are sometimes achievable. More compression tends to be offset by needing more vectors to achieve coverage. Another problem is that the additional circuitry, the codec, tended to end up in the center of the chip, since it had to be connected to everything. The center of the chip was already the most congested area, since signals that crossed the chip naturally tended to flow through there (the same reason the center of a city has the most traffic congestion).
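To make the fault-grading idea concrete, here is a minimal sketch (my own toy example in Python, not the algorithm of any particular fault simulator): it takes a tiny two-gate circuit, injects every stuck-at-0 and stuck-at-1 fault on the gate outputs, re-simulates the given vectors, and reports the percentage of faults whose effect is visible at the output.

```python
from itertools import product

# Toy netlist: two gates, n1 = AND(a, b), out = OR(n1, c).
# Faults are stuck-at-0 / stuck-at-1 on each gate output (n1, out).
GATE_OUTPUTS = ["n1", "out"]

def simulate(vec, fault=None):
    """Simulate one input vector, optionally forcing a node to a stuck value."""
    a, b, c = vec
    nodes = {"n1": a & b}
    if fault and fault[0] == "n1":
        nodes["n1"] = fault[1]            # override with the stuck-at value
    nodes["out"] = nodes["n1"] | c
    if fault and fault[0] == "out":
        nodes["out"] = fault[1]
    return nodes["out"]

def fault_coverage(vectors):
    """Percentage of stuck-at faults detected by the given test vectors."""
    faults = [(node, v) for node in GATE_OUTPUTS for v in (0, 1)]
    detected = set()
    for vec in vectors:
        good = simulate(vec)
        for f in faults:
            if simulate(vec, fault=f) != good:   # fault is observable at the output
                detected.add(f)
    return 100.0 * len(detected) / len(faults)

print(fault_coverage([(1, 1, 0)]))                      # one functional-style vector: 50.0
print(fault_coverage(list(product((0, 1), repeat=3))))  # exhaustive vectors: 100.0
```

Real fault simulators do this over millions of gates with far cleverer algorithms, but the coverage number they report means exactly this.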

Modus times: Cadence's Modus DFT Software addresses these issues by integrating test with physical design. Instead of the gates that make up the compressor/decompressor logic being grouped in the center of the chip, they are spread around the chip in a way that is efficient from a layout point of view. To increase compression ratios further, Modus uses what we call elastic compression: sequential elements are added to the codec, too, making it possible to preserve values between vectors. For more details on Modus, see my post Modus Test Solution—Tests Great, Less Filling.
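As a rough illustration of the codec idea (again a toy model of my own; the actual Modus compression architecture is more sophisticated, and all the names and numbers below are made up for the example), the sketch spreads a handful of tester channels across a couple of hundred internal scan chains through a fixed XOR network, and XOR-folds the captured chain outputs back down to the same handful of observe pins.

```python
import random

# Toy scan-compression codec: NUM_PINS tester channels drive NUM_CHAINS internal
# scan chains through an XOR "decompressor", and the captured chain outputs are
# XOR-folded back down to NUM_PINS observe pins by the "compactor".
NUM_PINS = 4      # bits supplied by the tester per shift cycle
NUM_CHAINS = 200  # internal scan chains fed in parallel, i.e. 50:1 expansion

random.seed(0)
# Fixed (made-up) wiring: each chain input is the XOR of two of the pins.
SPREAD = [random.sample(range(NUM_PINS), 2) for _ in range(NUM_CHAINS)]

def decompress(pin_bits):
    """Expand NUM_PINS tester bits into NUM_CHAINS scan-chain input bits."""
    return [pin_bits[i] ^ pin_bits[j] for i, j in SPREAD]

def compact(chain_bits):
    """XOR-fold NUM_CHAINS captured bits down to NUM_PINS observe bits."""
    out = [0] * NUM_PINS
    for idx, bit in enumerate(chain_bits):
        out[idx % NUM_PINS] ^= bit
    return out

# One shift cycle: 4 tester bits control 200 chain inputs, and 200 captured bits
# are observed through 4 pins.
tester_bits = [1, 0, 1, 1]
chain_in = decompress(tester_bits)              # 200 bits derived from 4
observed = compact([b ^ 1 for b in chain_in])   # stand-in for captured responses
print(len(chain_in), observed)
```

Elastic compression extends this kind of structure by adding sequential elements, so that values can be held in the codec and reused from one vector to the next rather than being re-supplied from the tester every cycle.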

Test Gets Some Respect

I opened by saying that test generally doesn't get the same respect as "sexier" areas of EDA. But that is not the case this year. The Kaufman Award for 2019 went to Tom Williams, who is one of the fathers of scan test, the way all digital chips are tested today. I wrote two posts about him: one when I interviewed him just after the award was announced (see my post Figure-Skating Champion Wins Kaufman Award), and one after the award ceremony (see my post Kaufman Award Dinner: The Tom Williams Story).

One of the people who worked with Tom over his career, starting in the early days at IBM, was Rohit Kapur. During the award dinner, he talked about working on compression with Tom. A couple of weeks ago, he talked with me.

I interviewed him for a video on what he sees as the current issues with test, and how Modus is addressing them.

More Information

There is lots, including more videos, on the Modus DFT Software Solution product page.

 

Sign up for Sunday Brunch, the weekly Breakfast Bytes email.