Tuesday, the second day of DAC. Last night I learned that in Texas there is a third thing that you don't discuss, to go along with politics and religion: barbecue. After the show closed, we went to the Iron Works barbecue, which is almost around the back of the conference center. It used to be a real ironworks. It is notable, too, for having a line drawn on the wall about 6 feet up that shows the level the floodwaters reached on June 5, 1935.
We discussed barbecue, because we're not from Texas. The lines for Franklin Barbecue, anointed the best barbecue in the nation by Gourmet magazine, start before 7am, and it doesn't open until 11. And it's hot in Austin. Locals say the Salt Lick is good, except for the one in the airport. Iron Works seems to come high on the list, though, and is just a five-minute walk from DAC.
The day opened with Lou Scheffer, who used to be a Cadence fellow in physical design before "making a mistake and leaving EDA," as Chuck Alpert, the DAC chair, said in his introduction. Lou talked about how far electronics has to go to catch up with biology. We need multi-billion-dollar fabs with no dust, whereas biology works fine in the dirt. It works much better, too. For example, a worm with just 300 neurons can learn, from just 10 examples, not to over-react to its tank being tapped.
So it is a hot topic to do what Lou and his colleagues are doing: studying the brain to discover how it works so that it can be copied. Of course, there are plenty of problems rooted in the brain, such as depression and autism, so there are good reasons to study it for its own sake, too. A fruit fly can learn to respond to a pleasant or unpleasant odor with 1100 neurons and 220K synapses. Lou says that we know the basic structure (see picture). At current rates, in a few years we should have the whole fly mapped.
Neural networks have gotten hot in the last few years, for AI and recognition tasks in particular. The field has been advancing fast. As Lou pointed out, the AlphaGo algorithm that beat the world champion is so efficient that it could have run on 1985-era hardware, had we known how to write it back then. There is much more to learn. Biological networks are very different. We don't understand how they work or how they are trained, but that is probably the source of the big differences in image recognition capability, where biological brains are so much better than silicon ones.
Lou finished by pushing back on Chuck's jibe in his introduction. He pointed out that 8 out of 10 people in his group have backgrounds in CS or engineering, not biology. We are in the golden age of neuroscience. He closed by trying to recruit us all: "If you want to improve electronics, don't study electronics. You should study the brain."
Sameer of NVIDIA talked about Driving the Next Decade of Innovations in Visual and Accelerated Computing. Since one of the uses to which GPUs are increasingly being put is neural networks, there turned out to be a lot more overlap with Lou's talk than you'd expect.
Of course, the early focus of NVIDIA (and GPUs in general) was improving realism in gaming. You can see how far we have come by looking at the above pictures of Kobe Bryant from a game today, compared to one from 20 years ago in 1996. In fact, it is not just the visual representation that has improved, the physics of the game—such as how the ball bounces—is also much better. It is now hard to tell at a glance whether a game is real or being rendered. The next step is virtual reality, but that requires still more powerful GPUs, since the rendering has to be flawless enough to fool the brain into believing you are "in the real world". In fact, it requires about 7X the graphics performance.
But about ten years ago, NVIDIA also started to focus on using GPUs as highly-parallel compute engines (for things other than graphics). For example, the most powerful supercomputer today is at Oak Ridge National Labs and has 18,000 NVIDIA GPUs.
As I already said, another area of interest is using GPUs to accelerate deep neural networks. There has been a huge change in this area in just five or six years. Until about 2010, vision was done using traditional programming but that seemed to top out at a success rate of about 70%. Since then, with the availability of massive amounts of training data, there has been a switch to neural networks, which are trained rather than programmed, and the success rate is almost 100%.
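The shift from "programmed" to "trained" that Sameer described can be sketched with a toy perceptron. This is my own illustration, not anything shown in the keynote: instead of a programmer writing explicit brightness rules, the classifier learns its weights from labeled examples.

```python
# A minimal sketch of "trained rather than programmed": a toy perceptron
# learns to separate bright patches from dark ones using labeled examples,
# with no hand-written classification rules.

def train_perceptron(samples, labels, epochs=20, lr=0.1):
    """Learn weights and a bias from (feature vector, label) training data."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = y - pred  # 0 if correct, +/-1 if wrong
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def classify(w, b, x):
    """Apply the learned linear decision rule to a new sample."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# Toy "vision" task: label a two-pixel patch 1 if it is bright, 0 if dark.
samples = [[0.9, 0.8], [0.8, 0.9], [0.1, 0.2], [0.2, 0.1]]
labels = [1, 1, 0, 0]
w, b = train_perceptron(samples, labels)
print(classify(w, b, [0.85, 0.9]))  # a new bright patch -> 1
```

Real vision networks are vastly deeper and train on millions of images, but the principle is the same: the behavior comes from the training data, not from code a human wrote.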
One area where vision and recognition are extremely important is self-driving cars. It has been an amazing year, with traditional car companies acquiring technology companies, real tests on real roads, and so on. One statistic Sameer quoted: Tesla has found that in Autopilot mode (basically self-driving), accidents involving airbag deployment are reduced by 50%.
He did also talk a little about EDA. This is DAC, after all. There have been huge improvements in the back end of design but fewer in the front end. He is optimistic that some of the neural network approaches might turn out to be really good at finding bugs in RTL before the first vector of simulation is run, more in the manner that a human might spot a problem just by reading the code. There are also possibilities for speeding up simulation using GPUs. A big remaining problem is the lack of correlation between the earliest and latest stages of design: surprises such as excessive power consumption still appear late and have to be fixed back in the front end, because the front-end analysis tools missed the problem. I'm not entirely sure this is a solvable problem, since it is not really possible to get a 100% accurate analysis of, say, wire length without doing the routing, and thus effectively pulling the back end into the front end.
Sameer finished with an acknowledgement that Moore's Law is slowing. But all is not lost. The airline industry had tremendous growth for years and then it matured: passenger miles doubled every three years from 1915 until the Boeing 747. But Boeing continued to innovate: materials, passenger experience, aeronautics. Its value as a corporation increased 100-fold from 1970 to today. So even as the pace of semiconductor nodes slows, there is plenty of innovation that can still be done.
RISC-V (pronounced RISC-five as in the Roman numeral, not RISC-Vee as it was referred to in the preamble to this morning's keynote) is an open reduced instruction set architecture (ISA). It was created at UC Berkeley but has acquired a lot of heavyweight industry involvement, too. The big question, to me, is whether it is going to turn out to be like Linux, laughed at as a toy and now the operating system that runs almost all datacenters and the cloud, and if you count the Android derivative, most smartphones, too.
At the pavilion, Krste Asanovic (of UC Berkeley) gave a Sky talk, "RISC-V: Instruction Sets Want to Be Free."
He started off asking how important ISAs are. Why can't Intel get into mobile? Why does ARM struggle to get into datacenters? How can IBM still sell mainframes over 50 years after defining the 360 ISA? It is interesting that most datacenters run on the AMD 64-bit x86 ISA. Why is Intel building AMD-architecture server chips? Because its attempt to define a new ISA with Itanium didn't work. What if there were a free ISA anyone could use? That's where RISC-V comes in.
Krste (yes, that really is a name with no vowels) gave the history. In 2010, he was wondering what ISA they should use for their next research program. The obvious candidates were ARM or x86, which dominate the landscape. But x86 is impossible (complexity, licensing issues), and ARM is almost as bad. So he started a three-month project to define a clean-slate ISA, and RISC-V was the result. They used it for a few years for teaching. But a funny thing happened: they started to get random emails from people all around the world complaining when they made changes to the ISA. There was such a hunger for a clean ISA free of IP issues that people had just started using it.
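Part of the appeal of a clean-slate ISA is how regular the encoding is. As a rough illustration (mine, not from the talk): every RV32I instruction is a single 32-bit word with fixed field positions, so a decoder for, say, the I-type format (used by ADDI) is a handful of shifts and masks.

```python
# Decode an RV32I I-type instruction word into its fixed fields.
# Field layout: imm[31:20] rs1[19:15] funct3[14:12] rd[11:7] opcode[6:0]

def decode_itype(word):
    """Split a 32-bit RV32I I-type instruction into its fields."""
    opcode = word & 0x7F          # bits 6:0
    rd     = (word >> 7) & 0x1F   # bits 11:7, destination register
    funct3 = (word >> 12) & 0x7   # bits 14:12, operation selector
    rs1    = (word >> 15) & 0x1F  # bits 19:15, source register
    imm    = word >> 20           # bits 31:20, 12-bit immediate
    if imm >= 0x800:              # sign-extend the immediate
        imm -= 0x1000
    return opcode, rd, funct3, rs1, imm

# 0x00500093 is ADDI x1, x0, 5 (the standard "load immediate" idiom)
print(decode_itype(0x00500093))  # prints (19, 1, 0, 0, 5): opcode 0x13, rd=x1, rs1=x0, imm=5
```

Contrast that with decoding a variable-length x86 instruction, which takes hundreds of lines before you even reach the operands.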
So in 2014, they froze the base specs of RISC-V. In 2015, they created the RISC-V foundation. You must be a member to use the RISC-V trademark (and all implementations must pass the compatibility suite to prevent fragmentation).
Who are members? Lots of companies, including IBM, Google, NVIDIA, Oracle, the encryption part of Rambus, and HP Labs.
Momentum seems to be building and RISC-V is going mainstream. Google told the last RISC-V conference, a few months ago, that their source-code management system will not commit a change to Coreboot if RISC-V is broken (don't worry if you don't quite know what that means; the point is that Google is committed to RISC-V, not just taking a wait-and-see attitude).
Krste finished by saying that now their "modest" goal is "to become the industry-standard ISA for all computing devices."
I will write a more extensive post covering more of the details, plus some of the later session about RISC-V after Krste had run upstairs to present details of the software ecosystem.
EMC2 pale-blue socks. I can't see myself getting a lot of use out of these. I've been dared to wear them to the Denali party this evening. It might be the only occasion I can use them! Since local Austin company Dell is acquiring EMC2, perhaps they will become collectors' items.
Next: DAC News, Wednesday
Previous: DAC News, Monday