Engineering teams are tracing test failures back to IR/voltage drop during test mode, and these false failures are impacting both yield and profitability.
We consider this a test-mode power management issue, one that should be addressed as early as front-end design and carried through ATPG, pattern/vector analysis, and sign-off.
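As one concrete starting point for the pattern/vector analysis step, here is a minimal sketch that screens scan patterns by weighted transition count (WTC), a common proxy for scan-shift switching power. The vector format, threshold ratio, and function names are assumptions for illustration only, not any particular tool's API.

```python
# Hypothetical sketch: estimate per-pattern switching activity from scan-in
# vectors as a rough proxy for test-mode power. The weighting of each
# transition by its remaining scan-chain depth follows the standard
# weighted-transition-count (WTC) metric.

def weighted_transition_count(vector: str) -> int:
    """Count bit transitions in a scan-in vector, weighting each
    transition by how far it shifts down the scan chain."""
    n = len(vector)
    return sum(
        (n - 1 - i)                     # cells the transition shifts through
        for i in range(n - 1)
        if vector[i] != vector[i + 1]   # adjacent bits differ -> a toggle
    )

def flag_high_power_patterns(vectors, threshold_ratio=0.4):
    """Return indices of patterns whose WTC exceeds a fraction of the
    theoretical maximum, i.e. candidates for IR-drop false failures.
    threshold_ratio is an assumed illustrative value."""
    flagged = []
    for idx, vec in enumerate(vectors):
        n = len(vec)
        max_wtc = n * (n - 1) // 2      # upper bound: fully alternating vector
        if weighted_transition_count(vec) > threshold_ratio * max_wtc:
            flagged.append(idx)
    return flagged

if __name__ == "__main__":
    patterns = ["010101", "000111", "111111", "101010"]
    print(flag_high_power_patterns(patterns))  # [0, 3]: alternating patterns
```

In practice a screen like this would feed back into constraining ATPG (e.g., low-power fill) or rerunning IR-drop sign-off on the flagged patterns, rather than simply discarding them.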
Can anyone share their thoughts on the top considerations for managing test power effectively and productively?
Possible topics include front-end low-power design, power-aware ATPG, pattern/vector analysis, and sign-off.
Looking forward to your input.