I am at embedded world in Nuremberg. One thing that is not here is warm weather. All I can say is that it is warmer in Nuremberg than in Munich, -7ºC instead of -11ºC. People say it hasn't been this cold for many years (although I remember visiting Munich when it was even colder, back when I lived in France in the early 90s).
Getting into the show, given its size, was remarkably easy. I was pre-registered, so I had a barcode. But unlike other shows I've been to, where you have to stand in line to get your real badge, you just use the barcode to get in. I didn't know this, so I stood in line at the press desk anyway, and they told me I already had everything. I wondered why most people had a badge rather than just a sheet of paper with a barcode, but then I scanned mine at the turnstile. Instead of admitting me immediately, the turnstile printed my badge and then admitted me. You've probably heard the old joke that heaven is where the police are British, the chefs are Italian, and the mechanics are German. The German turnstile was certainly living up to its reputation.
One problem with conferences about embedded systems, including this one, is that the name "embedded" covers a huge range of different types of design. Someone programming an Arduino or Raspberry Pi board is doing embedded. So are teams at Bosch, Tesla, or Waymo (fka Google) writing software for autonomous vehicles, or big programming teams at Cisco or Samsung writing software for networks and mobile. At the high end, the individual engineers probably don't consider themselves to be doing "embedded" at all. Like the phrase "system-level", which seems to mean any level above the one at which you work, embedded seems to mean any programming level below the one at which you work.
For example, one end of embedded is clearly programming microcontrollers, although even microcontrollers are often (mostly?) 32-bit now. In Mark Papermaster's opening keynote, in his brief "AMD commercial" at the end, he talked about their Epyc Embedded series: 16 cores, up to 32MB of cache, >3GHz clocks, 64 lanes of PCIe, and so on, all on a single SoC. That's pretty much a mainframe on a chip, but it goes under the embedded name.
Cadence has three main thrusts in embedded, using the word in its widest sense.
The biggest segment in semiconductor is mobile. It has had explosive growth in the ten years since Steve Jobs stepped onto the stage at the YBCA in San Francisco and made the first iPhone call to Hannah Zhang, working in the nearby Starbucks (who was a bit surprised by his order for 4,000 lattes to go). However, over the last few years, the growth rate has moderated from 30+% per year down to single digits. Depending on just which numbers you trust, it looks as if mobile has now started to decline in unit shipments. Most people who want a smartphone have one, and the new models don't offer a compelling reason to upgrade the moment a new generation is announced. I still use my iPhone 6s, and haven't felt the need to get an iPhone 7 or 8, let alone shell out over $1,000 for an iPhone X.
The numbers for automotive semiconductors are still modest, but the growth rate is already high, and the forecasts for the future are higher still. Until recently, automotive was a sleepy backwater of semiconductor, with chips designed in processes that had been around for a decade, by specialist groups that understood a lot about reliability. That has all changed. Autonomous driving requires high-performance semiconductor products for vision processing and neural network inference. The required performance cannot be delivered in ten-year-old processes, and that type of design cannot be done by the teams who created the old automotive chips. On the other hand, the design teams that can do big chips in the most leading-edge processes don't know much about reliability, and the foundries have not had to characterize leading-edge processes for automotive until recently. This has meant a crash course for everyone on what it is going to take to build high-performance, high-reliability 16/14nm and 7/10nm chips.
It is an exaggeration to say that embedded world is all about automotive, but not by much. Everyone, including Cadence, is positioning their products to appeal to the exploding requirements of the new automotive market, combining advanced silicon designs with stringent reliability requirements and an estimated 100M lines of code in a vehicle. A lot of transistors, a lot of lines of code, and ISO 26262 to keep everything honest from a reliability point of view.
One big difference at embedded world this year is that we are not relying on FPGA implementations of Tensilica processors to show off their capabilities. Dream Chip is a German company that has designed a very capable automotive SoC, manufactured in GLOBALFOUNDRIES' 22FDX process (in Dresden...this is an all-German product). The SoC contains four Tensilica Vision P6 DSPs. You can read more about the design in my post from last year, Dream Chip: A Vision for Your Car. It has a lot more automotive capability than we are demonstrating. The picture above shows the chip on its board. Well, you can't really see the chip; it's hidden underneath the blue heatsink.
In past years, we have shown some vision recognition neural nets running on an FPGA implementation of the Tensilica Vision DSP. This year, we have a similar demo, but now running on the SoC. We do have an FPGA implementation, too, but instead of running on an FPGA development board, it is running on a Protium FPGA prototyping system. Historically, the problem with FPGA prototyping has been the difficulty of getting the design into the system. The Protium system is not completely turnkey, but it is very close to being able to take any design that runs in Xcelium simulation and run it with just a compile (and in the Palladium platform, too, which shares most of the compiler code). So you can see software running on the Protium system as a prototype, and running on an SoC as quasi-production.
I think the most impressive demo is the four-camera car, which you can see above. It has four cameras pointing ahead, to the rear, and to each side. One of the challenges in automotive is to take input from multiple sources and stitch it together to get an overall view of the vehicle's environment. This is often called sensor fusion. This demo takes the input from all four cameras and melds it into an image of how things would look from a camera above the car, even though there isn't one there. In the above picture, you can see the image on the screen. It clearly shows the train. In a real ADAS system, in addition to merging the data streams from multiple cameras, there would be additional data from radar and lidar. Tesla is the possible exception, having always maintained that lidar is not required. But Tesla cars with manufacturer's plates have been spotted in Palo Alto recently with lidar, so perhaps Tesla is just avoiding Osborning itself by not pre-announcing a lidar solution until it has one.
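To make the idea of stitching four views into one surround image concrete, here is a deliberately tiny sketch, not the demo's actual code: each camera is reduced to a top-down grid of brightness values covering part of the area around the vehicle, and fusion simply averages every grid cell over the cameras that observed it. The grid size, coverage regions, and brightness values are all invented for illustration; a real system would warp full camera frames through calibrated homographies before blending.

```python
# Toy sensor-fusion sketch: merge four cameras' top-down projections
# into one surround ("bird's-eye") view. Values are illustrative only.

SIZE = 8  # 8x8 top-down grid around the vehicle

def camera_view(rows, cols, value):
    """One camera's projection: `value` in the cells it covers, None elsewhere."""
    grid = [[None] * SIZE for _ in range(SIZE)]
    for r in rows:
        for c in cols:
            grid[r][c] = value
    return grid

def fuse(views):
    """Average each cell over all the cameras that observed it."""
    fused = [[None] * SIZE for _ in range(SIZE)]
    for r in range(SIZE):
        for c in range(SIZE):
            samples = [v[r][c] for v in views if v[r][c] is not None]
            if samples:
                fused[r][c] = sum(samples) / len(samples)
    return fused

# Front, rear, left, and right cameras each cover half the grid,
# so adjacent views overlap and every cell is seen by two cameras.
front = camera_view(range(0, 4), range(SIZE), 100)
rear  = camera_view(range(4, 8), range(SIZE), 60)
left  = camera_view(range(SIZE), range(0, 4), 80)
right = camera_view(range(SIZE), range(4, 8), 40)

birds_eye = fuse([front, rear, left, right])
print(birds_eye[0][0])  # front+left overlap: (100 + 80) / 2 = 90.0
```

Even in this toy form it shows the essential property of the demo: no single camera sees the whole scene, but the fused grid has a value everywhere, built from whichever sensors cover each spot.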
Cadence is best known for its suite of IC design tools, but embedded world is not really a place to showcase semiconductor design. One area where the high-powered tools and automotive come together is compliance testing of in-car networks. A car is a very noisy environment. The historic networks such as FlexRay and CAN bus do not have enough performance and flexibility going forward, and the future standard looks likely to be the automotive version of Ethernet. But this requires good signal integrity analysis for compliance testing covering EMI, thermal, noise, and more. Cadence's Sigrity technology is the primary tool for analyzing signal and power integrity from chips, to packages, to boards...to cars.
More about embedded world next week.
Sign up for Sunday Brunch, the weekly Breakfast Bytes email.