This week saw the publication of two interesting blog posts regarding the growing
challenges of FPGA verification, first
from my buddy Dave Orecchio over at GateRocket and then
from my Cadence colleague Steve Leibson. Both posts made the point that FPGA developers
are increasingly facing the same verification issues as developers of
non-programmable devices. This trend has been evident for quite a few years,
but the number of FPGA users affected has grown from a tiny fraction to an
entire upper tier of developers whose design size and complexity rival or even
surpass those of many ASIC and SoC projects.
When I read posts such as these, I immediately think back to a visit I made some
years ago to an FPGA-based design center facing a verification crisis because they didn't realize that their world had changed. I've
made well over a thousand customer visits across my last four jobs, but very
few stand out as clearly as this one. I actually wrote an article
about it back in a previous life, but for a number of reasons I did not go into
a lot of detail. With the more personal blog format, and the passage of enough
time since my visit that I'm sure no one will identify the company, I can now fill
in some of the gaps.
The prospective customer was a cable television headend manufacturer.
They made an FPGA-based system to collect video feeds from various sources (mostly
analog at the time), integrate them with channel guides and local commercial
inserts, and send the result out to the subscribers. They were in the process
of developing their next-generation system, which included such digital features
as video-on-demand and was therefore much more complex. When I visited, they
had the first prototype of their new headend system in the lab and were in the
bring-up process. Sounds like an exciting time for the engineers, yes?
Sadly, not so much. The problem was that they had been in the lab for more than
six months and still did not have a working prototype. This was their
verification crisis, and so I talked with the engineers to see if my company could
help. I learned that their designers wrote the VHDL code and then did some
minimal testing by hand-writing binary vectors for the inputs and examining
waveforms on the outputs. Their entire EDA tool suite consisted of inexpensive
VHDL simulators and the FPGA vendor's tools running on a handful of Windows
PCs. They had no real verification plan, no verification methodology, and no
dedicated verification engineers.
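For readers who have never seen this style of testing, here is roughly what it
looks like. This is a hypothetical sketch in SystemVerilog (their actual code was
VHDL), with a trivial stand-in design and invented signal names:

```systemverilog
// Hypothetical sketch of directed, hand-vector testing.
// The DUT here is a trivial stand-in; all names are invented.
module dut (
  input  logic       clk,
  input  logic       rst_n,
  input  logic [7:0] din,
  output logic [7:0] dout
);
  always_ff @(posedge clk or negedge rst_n)
    if (!rst_n) dout <= '0;
    else        dout <= din;   // stand-in for the real logic
endmodule

module tb_hand_vectors;
  logic       clk = 0;
  logic       rst_n;
  logic [7:0] din;
  logic [7:0] dout;

  always #5 clk = ~clk;

  dut u_dut (.clk, .rst_n, .din, .dout);

  initial begin
    rst_n = 0; din = '0;
    #20 rst_n = 1;
    // Hand-written binary vectors, one per case the designer
    // thought of; outputs are checked by eyeballing waveforms,
    // with no self-checking and no coverage measurement.
    #10 din = 8'b0000_0001;
    #10 din = 8'b1000_0001;
    #10 din = 8'b1111_1111;
    #50 $finish;
  end
endmodule
```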
In the past, this had always been good enough. The designers did their simple
testing, built a prototype or two, programmed the FPGAs, and did most of their
debugging in the bring-up lab. When they hit a bug, they reprogrammed the chips
and tried again. But with their more complex design they had absolutely hit a
wall. Their visibility into the larger FPGAs was limited, so they went through
a lot of "burn and churn" loops just trying to access the right signals to debug
each problem with a logic analyzer. Then they spent more loops trying out each
proposed fix, since they weren't able to reproduce the problems or verify the fixes
in simulation. The result was six months of frustration working on the
prototype for a product whose market window was rapidly shrinking.
They were frankly panicking, looking for anything that might help. I discussed how formal
analysis was really good at finding the root cause of bugs, even post-silicon.
If the failure behavior seen in the lab could be captured with an assertion,
then formal could generate the exact sequence leading to the failure, with full
debug visibility into the VHDL design. They listened attentively and gave this
approach some thought, but they simply didn't have the funding to upgrade their
simple EDA setup to powerful workstations and more expensive ASIC-style tools.
I agreed with our Sales team that they were not a good prospect, and never spoke
with them again.
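To make that assertion idea concrete, here is a minimal sketch of how a lab
failure might be captured as a property for a formal tool. I'm using
SystemVerilog assertions for illustration (their design was VHDL, where PSL
serves the same role), and the signal names and the 8-cycle bound are my
inventions, not theirs:

```systemverilog
// Hypothetical properties capturing failure behavior seen in the lab.
// A formal tool either proves each assertion or generates the exact
// cycle-by-cycle input sequence that violates it, with full
// visibility into the design for debug.
module headend_props (
  input logic clk,
  input logic rst_n,
  input logic req,        // e.g., a channel-switch request
  input logic grant,      // the response that sometimes never came
  input logic fifo_empty,
  input logic fifo_rd_en
);
  // Lab symptom: a request that is never serviced.
  assert property (@(posedge clk) disable iff (!rst_n)
                   req |-> ##[1:8] grant);

  // Another classic: reading an empty FIFO corrupts the stream.
  assert property (@(posedge clk) disable iff (!rst_n)
                   fifo_rd_en |-> !fifo_empty);
endmodule
```

The payoff is that once the tool produces a violating trace, the same trace can
be replayed in simulation to confirm a proposed fix before another trip through
the FPGA build flow.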
I don't know what happened on that project, but I've checked a few times over
the years for any trace of that company and have found nothing. I suspect that
the market window for their new system closed before they had a product ready
to ship, and that was the end. But I do think of that visit whenever FPGA
verification is discussed. Every year, more and more FPGA designs are moving
into hazardous size/complexity territory. Some development teams will adopt the
proven verification techniques of the ASIC and SoC world: constrained-random
stimulus, UVM/OVM, verification planning and management, and metric-driven
verification. Others will hit the wall and get squashed as surely as a June bug
on a windshield at 70 MPH.
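For anyone who hasn't seen constrained-random stimulus in action, here is a
minimal sketch in plain SystemVerilog, without the UVM machinery; the
transaction fields and constraints are hypothetical, loosely inspired by a
video-packet interface:

```systemverilog
// Hypothetical constrained-random transaction. Constraints steer
// randomization toward legal, interesting traffic instead of a
// handful of hand-picked vectors.
class video_pkt;
  rand bit [7:0]  channel;
  rand bit [11:0] length;
  rand bit        is_vod;   // video-on-demand traffic

  constraint legal_len { length inside {[64:1500]}; }
  constraint vod_mix   { is_vod dist {1 := 30, 0 := 70}; }
endclass

module tb;
  initial begin
    video_pkt pkt = new();
    repeat (1000) begin
      if (!pkt.randomize()) $fatal(1, "randomize() failed");
      // drive pkt onto the DUT interface here, and let functional
      // coverage (the metric-driven part) tell you when to stop
    end
  end
endmodule
```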
If you're an FPGA developer moving to larger devices, I can only hope that you
heed my sad story and address your growing verification needs before they turn
into a crisis of your own.