Tuesday, June 20, 2017, Austin Convention Center. Noon.
I’m wilting in the heat. The temperature outside will get up to 95°F today, with 71% humidity. I’m such a delicate California flower, durn it; never have I been so thankful for Texas air conditioning. I’m also over-caffeinated and underslept (having finally arrived at my hotel in Austin at 12:30am), but happy to be here to experience my first DAC. My first CDNLive was the subject of my first blog post, and I’ve now been here for six months. It’s been quite a trip so far.
I am sitting in the cool, dark ballroom at the convention center, eating sandwiches and chips with about 400 others at today’s Cadence-sponsored luncheon; the theme is verification. (Frank Schirrmeister wanted to call the panel “Making Verification Smart Again”, but was overruled – so it was officially called “Towards Smarter Verification”.)
Smoothly moderated by Ann Mutschler of SemiEngineering, the panel consisted of:
In a very small nutshell, each of the panelists highlighted the importance of the evolving approaches they’re taking to their verification methodologies, and all of them were excited about using “machinelearningdeeplearning” as the way to achieve their long-term goals.
All of the panelists came back to a couple of recurring themes throughout the discussion:
If you haven’t noticed, I tend to think and write in metaphor. I have talked about coding and knitting, what A2 will look like to the average person, and what AI and language translation have in common. And here at DAC17, I feel another metaphor coming on… Bear with me here.
In cooking reality shows, the challenges given to the chef contestants seem to fall into one of two basic categories: Either they are given a limited number of possibly disparate ingredients and they create a new masterpiece in their wee competition kitchens; or they are given perhaps unusual circumstances with an unlimited number of ingredients, and they come up with some new pièce de résistance.
In my kitchen, I also have two different approaches to cooking:
a.) Scrounging. I look in the fridge and pantry and see what I can create from what is already there; or
b.) Planning. I want to make a specific dish, so I plan ahead to get everything that I need to make that thing in the time frame that I need it.
The first approach is generally used for weeknight dinners or when a [chocolate] craving hits; the second, when we have guests. Both ways work. Either way is valid. They both require a certain amount of skill and planning. They both need fresh ingredients to create an adequate product. Each has its own positives and negatives. But if I’m making something complicated? I get much better results using the latter method.
This perfectly illustrates the “bottom-up” and “top-down” approaches to designing and testing complicated systems.
The bottom-up approach considers all of the little things that must be designed and tested and debugged, rummaging around in the back of the pantry and behind the condiments. This method allows me to use yogurt if I don’t have enough milk. If I don’t have dark chocolate, cocoa powder and a little oil will do. And depending on my skill and experience (and oh is it vast), the result can be kind of tasty, nourish my kids, and do away with the craving. Or I might have to do without, should the milk be sour or I gamble on some taste combination that… um. Doesn’t work. (Don’t ask my kids about the time I used grape jelly instead of sugar.)
The top-down design approach starts with a holistic view of the complicated system, identifying the functionality of the complete project (say, a large family gathering in November). As the process continues, the system is broken into progressively smaller pieces (turkey, stuffing, gravy, pumpkin pie, cranberry sauce…). Keep breaking them down until you can make a shopping list: you’ve identified the smallest pieces that will make up the whole. Finally, when the pieces are completely designed, debugged, washed, mixed, tasted, sauced, roasted, whipped, and verified, they’re hooked up together again to create the complete system as it was originally scoped: Thanksgiving dinner.
Cadence offers tools for both methods. That said, considering the complexity and scope of the EDA industry, using a top-down approach will make the most sense moving forward, at least in terms of smart verification. Verify that the sour cream is fresh before you mix it into the chiffon pie, not afterward. Try not to discover that you should have brined the turkey for 12 hours before you open the cookbook on Thanksgiving morning. Don't call your mother because you can't find the recipe for cranberry relish.
And please help the chef if the kitchen is a mess after the apple pie.
 Just to orient those of us at the entry level of the EDA world: the verification stage tests that the RTL is correct. These days, the engineers who run the tests are separate from those who write the RTL. In practice, verification is never finished, so it takes place in parallel with the rest of the chip design, with fixes for any discovered errors incorporated into the design at regular intervals.
 In this case, “shift left” means finding as many bugs as possible early in the design cycle (the “left” end of the timeline), since bugs are much cheaper to fix before a lot of work has been built on top of the buggy RTL.