Jim Hogan has been a mover and shaker in the EDA industry since long before the term "EDA" was invented. Today a well-known independent venture capitalist, Hogan previously ran both R&D and marketing for the Cadence Virtuoso product, and he knows the custom/analog world well. He's invested in companies with portfolios ranging from behavioral synthesis to design for manufacturability (DFM). In this interview he talks about custom/analog challenges at advanced nodes, DFM, power, analog "automation," and the parasitic-aware design capability recently introduced by Cadence.
Q: Jim, can you tell us about your background?
A: I was a semiconductor device physics guy for 10 years, and then I got into the early days of EDA and was responsible at National Semiconductor for the investments we had in nascent EDA companies like SDA and ECAD [which combined in 1988 to form Cadence]. I got more and more involved in CAD stuff because I became director of CAD at National. Then [SDA founder] Jim Solomon asked me to come and become the marketing VP of the analog group that was forming at Cadence. Charlie Janac was the marketing VP but was moving to Europe to do sales for Cadence.
I had a series of jobs at Cadence over the years. I ran IC marketing, I ran R&D, I ran all the field application engineers, I was president of Cadence Japan, and I handled investments and M&A for Cadence. I worked on a public offering for a company called Artisan that was later sold to ARM. Then I became involved in Cadence's captive venture firm, Telos. After that I went off and did my own investment firm with Scott Becker (former CTO of Artisan) with a fund we call Vista Ventures. Companies we've invested in that have EDA technology include Brion Technologies, ClearShape, Ponte, and AutoESL, which was sold to Xilinx a month ago.
Q: As custom/analog designers move to advanced nodes, what challenges are they struggling with?
A: From the beginning of the [Cadence] analog group, we wanted to help the customers bridge the physics gap. We took care of the physics and handled process rules. These days the custom designer is building much bigger circuits, and block-level capacity is an issue. There's a lot of variability in the manufacturing process itself, coming from materials and lithography, and it requires that a lot more corners be characterized. That leads to a huge capacity problem and a huge simulation requirement.
I think Cadence has done a good job upgrading Virtuoso to be more electrically driven, and to understand the value of transistor placement. The more automation you put into people's hands, the better. For analog designers, unlike digital designers, there is no "correct" answer. It isn't binary. With analog, designers look for the "most correct" answer. This is a function of the analog engineer's ability to diagnose and consider design options. As a result, any time or productivity that can be gained is welcomed.
Q: Are new problems occurring at 28nm and below?
A: There's an explosion of design rules. And from an electrical standpoint, the effect of nearest neighbors -- with proximity effects coming from advanced lithography -- has made things incredibly difficult. On the materials side, the transistors are leaky as heck, and how do you take full advantage of process capability when the process has a very narrow distribution? Leakage narrows the transistor's optimal operating range. There's not a lot of opportunity to differentiate yourself. The better characterized the transistor, the more opportunity there is to explore the design space.
Q: We typically hear about DFM in a digital context. What are the challenges in the custom/analog area?
A: DFM tools are probably even more required [in custom/analog] because topology plays such a major role in circuit performance. And electrical performance is a huge issue. As people try to build better transistor models, those models have to be well characterized, and your simulator has to have high capacity because there are so many corners. Design of experiments is an interesting thing to work on. We have to be intelligent about choosing the minimum number of corners that reflect the circuit and process capability.
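The corner-count problem Hogan describes can be made concrete with a toy sketch. Simulating every combination of process, voltage, and temperature grows multiplicatively, which is what design-of-experiments techniques try to avoid. The corner axes and the reduced set below are illustrative assumptions, not any particular foundry's corner list:

```python
from itertools import product

# Hypothetical corner axes; a real PDK defines many more parameters.
process = ["ss", "tt", "ff", "sf", "fs"]      # device corners
voltage = [1.62, 1.80, 1.98]                  # nominal 1.8 V +/- 10%
temperature = [-40, 27, 125]                  # degrees C

# Brute force: simulate every combination.
full_factorial = list(product(process, voltage, temperature))
print(len(full_factorial))  # 45 simulations per testbench

# A simple design-of-experiments reduction: nominal plus the extreme
# combinations expected to bound behavior, assuming a roughly monotonic
# response along each axis (an assumption that must itself be justified
# against the circuit and process data).
reduced = [
    ("tt", 1.80, 27),     # nominal
    ("ss", 1.62, 125),    # slow, low voltage, hot: worst-case speed
    ("ff", 1.98, -40),    # fast, high voltage, cold: worst-case leakage
]
print(len(reduced))  # 3 simulations per testbench
```

Even this small example shows the leverage: three axes already produce 45 full-factorial corners, and the count multiplies again with every statistical or layout-dependent parameter added.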
We have to worry about materials -- the thinning of wafers, and the mechanical effects of that. We need to take a look at what we've considered to be second-order or third-order effects. Typically, we look to foundries to provide the information and the models. EDA tools need to give us a meaningful debug capability in a world that's getting more complex both in the absolute number of instances to look at and the parameters to be considered.
Q: Power-aware design is also usually discussed in a digital context. How does it apply to the analog world?
A: Analog circuits are generally smaller in terms of area than digital logic, so the available design techniques are on a much smaller scale. You don't have a lot of architectural bandwidth; for example, you can't trade off software and firmware to reduce power. When analog designers worry about power, they derive most of their power management opportunity from the process capability. Power also affects transistor performance through heat. Therefore power tools are really important. Cadence has a bunch of them, and it's interesting that Apache, which had power tools as its original offerings, is now going public. The power problem is only going to get more difficult.
Q: Is "analog automation" an oxymoron? Analog "synthesis" hasn't worked yet -- will it?
A: Let's use an analogy to memory. If you go back to the early 1990s, there weren't many memory compilers. It was still the domain of PhD theses. Now everybody ships a memory compiler. When you talk about analog synthesis, I think you need to think more about memory compilers than logic optimization programs. First of all you've got to have the ability to generate physical analog circuitry from a behavioral description via a compiler. An analog compiler, for example, requires physical symmetry to assure that everything is the same electrically. Today we see this capability from Cadence and others.
Number two, and this is where Cadence has taken a leadership position, is that you've got to come up with the ability to describe circuitry in a behavioral language like VHDL-A or Verilog-A. I think that's caught on for a lot of people. If you go a level above that, say to Matlab, you describe an arithmetic function. Can you take that Matlab arithmetic function to a Verilog-A implementation? That's out there a ways, but going from Verilog-A to an analog compiler is arguably available today from Cadence.
Q: The recent Cadence Virtuoso IC6.1.5 announcement emphasizes parasitic-aware design, which allows a circuit designer to do a quick layout and get parasitic information. Is this a valuable capability?
A: Making an electrically-aware layout has been a dream for a long time. The layout can be done faster and be a lot more convergent. It's very promising that Cadence has delivered that. It's one of those ideas that goes back at least 15 years and it is really nice to see it delivered by the new release.
Q: Another new capability in Virtuoso IC6.1.5 is in-design manufacturing signoff. What's your view of this capability?
A: If you make a move in a layout tool, you want to know if the move is legal in terms of rules and DFM effects. So doing that in situ, as opposed to a Calibre post-layout approach, is a huge advantage in terms of productivity and correctness by construction. Any time you can give the designer feedback that something will hurt -- or not -- it's a good idea. We had Diva [interactive DRC] for a long time, but this is much more sophisticated than Diva and is a huge advantage for Cadence.
Q: What's your overall view of Cadence in the analog market today?
A: Cadence has the lion's share of the market. There have been a few attempts to displace Cadence, but having a "me too" product doesn't cut it. I believe Virtuoso still has a lot of legs, and I think it's been very discouraging to other companies trying to come up with a replacement product for the Cadence solution. My hat is off to Cadence for making that happen.
I think Virtuoso is in very good shape these days. I was Virtuoso R&D chief and marketing chief for a long time, so it warms my heart that Virtuoso is still dominant and not standing still. And for Spectre [simulation], there's been a huge surge in capability as well. From a technical point of view, I think the analog/mixed-signal portion of Cadence is in as good a shape as it's ever been in. Awesome job, everyone!
A nice resume -- albeit rather Cadence-centred -- but thank you anyway.
One potential problem with "parasitic aware" design is when the device modellers take it as relieving them of the need to include the parasitic calculations in the explicit device modelling.
The problem here is transparency, and there are two potential negatives.
If approximate parasitic calculations are included in the basic device model the designer can initially optimise semi-analytically; in addition individual designers can check the calculation algorithm for sanity (and some of us do - albeit somewhat randomly).
If on the other hand the substrate resistance model in the parasitic extraction is incorrectly chosen the designer is unlikely to discover the error until samples don't work as expected - and even then it can be extremely difficult to identify the cause. [Sorry to "bang on" about this - but I've just seen another example of just this problem.]
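The commenter's transparency point can be illustrated with a toy calculation: when an approximate parasitic term is part of the explicit device model, the designer can reproduce it by hand and use it to sanity-check what the extractor reports for the same geometry. The spreading-resistance formula and all values below are illustrative assumptions, not any particular foundry's substrate model:

```python
def substrate_resistance(rho_ohm_cm, contact_radius_um):
    """Toy spreading-resistance estimate for a circular contact of
    radius r into a semi-infinite uniform substrate: R = rho / (4*r).
    This is a textbook approximation, used here only as a sanity check."""
    r_cm = contact_radius_um * 1e-4   # convert microns to cm
    return rho_ohm_cm / (4.0 * r_cm)  # ohms

# Semi-analytic check the designer can do on paper:
rho = 10.0      # ohm-cm, hypothetical substrate resistivity
radius = 5.0    # um, hypothetical effective contact radius
r_model = substrate_resistance(rho, radius)   # 5000 ohms

# If a (hypothetical) extractor reports a value in the same ballpark
# for the same geometry, the substrate model choice is plausible;
# a wild disagreement is the early warning the commenter describes.
r_extracted = 5200.0   # ohms, illustrative extractor output
assert abs(r_extracted - r_model) / r_model < 0.2
```

When the substrate model lives only inside the extraction engine, this kind of back-of-the-envelope cross-check is no longer possible, which is exactly the transparency loss the comment warns about.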