It doesn't seem to matter what a show is ostensibly about at the moment. Every show seems to talk about quantum computing, 5G, or artificial intelligence (AI). DesignCon, the focus of which is PCB design, was no exception, with three keynotes on...quantum computing, 5G, and AI. They weren't quite as off-topic as I managed to make them sound. The quantum computing keynote focused on how you get data in and out of a quantum computer (answer: microwaves). The 5G keynote was about connected cars. And the AI one was more about Uber's datacenters and how they are building compute infrastructure at scale, and so was also about connected cars.
Robert Heath of UT Austin talked about using 5G for V2X communication. V2X covers vehicle-to-vehicle (V2V) and vehicle-to-infrastructure (V2I). As I've said before, I'm a little skeptical of just how important 5G is to automotive. But, as Robert pointed out, it is very attractive to 5G suppliers: "You are a carrier, everyone has a phone, who do you sell to? Can we sell phones to the cars?" Vehicles are becoming smarter, but sensors (cameras, lidar, radar) are limited by line-of-sight, since they can't see around a truck, let alone a corner.
There is scope to use V2X to improve this. Robert's examples were a truck or an oncoming car communicating whether or not it is safe to pass, or even a truck broadcasting video of what it can see (although this requires high data rates). One automotive standard, DSRC (Dedicated Short Range Communication), has been around for a long time but has not been widely deployed. Robert was told by one car manufacturer that "we're waiting for everyone else to put it in and then we'll put it in." In fact, it has been so narrowly deployed that governments now want the spectrum back! Since there is no need to duplicate 5G's capabilities with DSRC, I suspect that DSRC will fade away.
One school of thought about autonomous driving is that it will be completely on-vehicle. I think that some communication with infrastructure is likely to happen gradually, such as traffic lights signaling when they will change, allowing vehicles to optimize transiting the junction. One thing I hadn't thought of is that cellular basestations could also broadcast useful information. After all, they have two things going for them: they already have antennas and transmitters, and they tend to be high up, with a much better view of the road than you get from the vehicle itself. There's a reason airport control towers are towers: so that controllers can see the planes on the ground (which have worse visibility than cars, with no rear window or mirrors).
The next day it was Uber's turn. Gloria Lau, who is head of hardware engineering, talked about engineering their datacenters. First, she talked about Uber Elevate, their helicopter-based flying craft. Most of us watched the Jetsons as kids, and flying cars are finally coming true. As I wrote when I covered CES (see my post Consumer Electronics: 5G, AI, and Air Taxis), this is based on electrically powered helicopters with six rotors/engines (so naturally much more fail-safe than a normal helicopter). Gloria had some statistics on Uber's growth (in the non-flying vehicle category): it is in 65 countries and 600 cities, hit 10B cumulative rides last year, and has 3M active drivers, 75M active riders, and 15M rides per day (which seems very high: it implies one in five active riders uses Uber every day).
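That "one in five" aside is easy to sanity-check; a quick sketch using only the figures quoted in the talk:

```python
# Sanity-check the quoted Uber statistics (figures as given in the keynote).
active_riders = 75e6   # active riders
rides_per_day = 15e6   # rides per day

# Upper bound: if every ride on a given day were taken by a distinct rider,
# this is the fraction of active riders using Uber that day.
fraction = rides_per_day / active_riders
print(fraction)  # 0.2, i.e. one in five
```

Of course, some riders take multiple rides per day, so the true daily fraction of riders is somewhat lower; 0.2 is the upper bound.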
Uber uses a lot of AI, which requires a lot of compute power in the datacenters.
The exhibits range from small companies that you've never heard of, with 10x10-foot booths, to big companies like Cadence and Keysight. At GOMAC a couple of years ago, I met a guy from Keysight who hadn't dared stop to eat on the drive to Reno, since he had equipment in the back of his SUV for characterizing satellite links that sold for over a million dollars. I'm not sure how pricey some of the equipment they had on display at DesignCon was, but it was in six figures, for sure. Oscilloscopes are a strange product line, with low-end ones selling for a few hundred dollars, and high-end ones for a quarter of a million.
On the first morning, Cadence ran a half-day tutorial on using AMI and IBIS for modeling SerDes and DRAM. For details, see my post DesignCon: Cadence Teaches AMI and IBIS.
Also on the first morning, Cadence ran a tutorial on Silicon Photonics, along with Lumerical and TowerJazz. Since that ran in parallel with the AMI/IBIS tutorial, I couldn't be in two places at once. But the following day, there was a panel session titled Photonics Coming of Age, the Emergence of PDKs. It was moderated by James Pond, Lumerical's CTO, with a panel consisting of TowerJazz, HP Enterprise, Mentor, Smart Photonics, and Cadence's Gilles Lamant. I'll cover that panel session in a future post.
Cadence's booth was focused on signal integrity. The station that got the most attention was the running 112G SerDes, showing the PAM4 eye diagram (with 3 eyes per symbol, since PAM4 encodes 2 bits per symbol). Signal integrity at those speeds is clearly a topic of great interest. On Thursday, when Cadence had a whole day of sponsored sessions, the presentation on Modeling and Simulating 112G SerDes was standing-room only. I'll cover that in a future post.
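The arithmetic behind those 3 eyes is worth spelling out: PAM4 uses four voltage levels to carry 2 bits per symbol, so the eye diagram shows one eye between each adjacent pair of levels, and a 112G link needs to signal at only 56 GBaud. A minimal sketch (the Gray-coded mapping and level values here are illustrative, not the demo's actual encoding):

```python
# PAM4 carries 2 bits per symbol on 4 levels; the 3 gaps between
# adjacent levels are the 3 eyes in the eye diagram.
# Gray coding (typical for PAM4) makes adjacent levels differ by one bit.
GRAY_PAM4 = {
    (0, 0): -3,  # lowest level
    (0, 1): -1,
    (1, 1): +1,
    (1, 0): +3,  # highest level
}

def bits_to_pam4(bits):
    """Pack a flat bit sequence into PAM4 levels, 2 bits per symbol."""
    return [GRAY_PAM4[(bits[i], bits[i + 1])] for i in range(0, len(bits), 2)]

print(bits_to_pam4([0, 0, 0, 1, 1, 1, 1, 0]))  # [-3, -1, 1, 3]
print(112 / 2, "GBaud")  # 2 bits/symbol halves the symbol rate: 56.0 GBaud
```

Halving the symbol rate is the whole point of PAM4 at these speeds: the channel only has to pass 56 GBaud instead of 112, at the cost of a third of the vertical eye opening per eye.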
Sign up for Sunday Brunch, the weekly Breakfast Bytes email.