With shrinking process geometries, exponential growth in test complexity and its associated costs (and risks), and aggressive DPPM goals (test escapes):
1) Are today's fault models sufficient to enable defect detection? What are the most common reasons for field escapes?
2) Are there any concerns about over-testing due to pessimistic or over-conservative fault models? Do you think too many good chips are wastefully thrown away?
3) Structural and defect-based testing vs. functional tests. What are the scenarios where there are no alternatives to functional tests?
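To ground the fault-model question for readers less familiar with structural test, here is a minimal sketch of how a classic single stuck-at fault model works: a fault forces one internal net to a fixed value, and a test vector "detects" the fault if the good and faulty circuits produce different outputs. The toy circuit, the net name `n1`, and the function names are all illustrative, not from any particular tool flow.

```python
# Toy single stuck-at fault detection on the circuit y = (a AND b) OR c.
# All names here (circuit, n1, detecting_vectors) are illustrative.
from itertools import product

def circuit(a, b, c, stuck=None):
    """Evaluate the circuit; `stuck` optionally forces net n1
    (the AND-gate output) to a fixed value, modeling a stuck-at fault."""
    n1 = a & b
    if stuck is not None:
        n1 = stuck  # inject stuck-at-0 or stuck-at-1 on n1
    return n1 | c

def detecting_vectors(stuck_value):
    """Return input vectors whose good and faulty outputs differ."""
    return [v for v in product([0, 1], repeat=3)
            if circuit(*v) != circuit(*v, stuck=stuck_value)]

print(detecting_vectors(0))  # → [(1, 1, 0)]: only a=b=1, c=0 exposes n1 stuck-at-0
```

The point of the sketch is the limitation being debated: any defect whose misbehavior does not map onto "one net stuck at a fixed value" (resistive bridges, small delay defects, etc.) can slip past a test set graded only against this model.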
Please share your thoughts. I've been in the IC design and manufacturing industry for nearly 20 years and would like to gain more insight into the most recent challenges, as well as thoughts, beliefs, and findings (empirical or otherwise) ...
Getting back to fundamentals...! It is heartening to see that many in the test community are re-examining some of the basics. Implicit in the questions you are asking is the issue of understanding how we test basic logic elements. While much of the recent press on test has focused on compression and system-level issues, it turns out that much is being left on the table in existing test methodologies because of the assumption that "we already have the basics right". It will be very interesting to watch the light bulbs go off around the world... ;)