
Paul McLellan
Tags: Automotive, functional safety, lidar, radar, camera, ISO 26262

Automotive Summit: The Road to an Autonomous Future

13 Dec 2018 • 6 minute read

Before Thanksgiving, Cadence held an Automotive Summit. I was going to dive into some of the detailed material presented, but it occurred to me that it might be a good time to step back and look at where we are in the industry as a whole. I've written a lot about automotive. In fact, in the second week of Breakfast Bytes I wrote my first automotive post: Ten Years Ago Self-Driving Cars Couldn't Go Ten Miles. It was actually 11 years (but that didn't make a catchy headline): in 2004, so now 14 years ago, DARPA held the first "Grand Challenge" for driverless cars over a 120-mile route in the Mojave Desert. The furthest any vehicle got was 8 miles. A lot has happened since then.

I'll use Raja Tabet's opening presentation as an organizing theme, but drop in other stuff. He is Cadence's EVP of Emerging Technologies, of which automotive is the main one.

Automotive Semiconductor

There is a demand-side reason and a supply-side reason for all the interest in automotive semiconductors. The demand side is that ADAS (advanced driver assistance systems) and eventually autonomous driving require a big change in the requirements for semiconductors. Historically, automotive semiconductors have been built in ten-year-old processes, with lots of characterization data, lots of analog, and small microcontrollers. In the future, the processing requirements mean that they will be built in advanced processes, such as 16nm and 7nm FinFET. The supply side is that the overall semiconductor industry is growing at a CAGR of 5.2%, whereas automotive is almost twice that, at 10.6%, and autonomous driving silicon over twice that again, at 23.6%. This means that the average semiconductor value per vehicle is going to more than double from about $250 today to $600 in 2022. It is expected to triple with the deployment of autonomous vehicles.
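As a back-of-the-envelope sanity check on those numbers (the $250 and $600 figures are from the talk; the helper function is just a sketch), the implied growth rate of per-vehicle semiconductor value can be computed directly:

```python
# Back-of-the-envelope check of the per-vehicle semiconductor value
# quoted above: about $250 today (2018) rising to $600 in 2022.

def cagr(start, end, years):
    """Compound annual growth rate between two values over a period."""
    return (end / start) ** (1 / years) - 1

# 2018 -> 2022 is four years of compounding.
implied = cagr(250, 600, 4)
print(f"Implied CAGR of semiconductor value per vehicle: {implied:.1%}")
# -> about 24.5% per year, close to the 23.6% quoted for autonomous-driving silicon
```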

This is leading to a shake-up in how the automotive industry is structured. I was at the Ludwigsburg conference earlier in the year and practically every company had a version of the above picture with just the company name changed. Historically, the car companies (OEMs in automotive-speak) like Ford, Toyota, and BMW would deal with major automotive suppliers (tier-1s in automotive-speak) like Bosch, Denso, and Continental. The tier-1s would deal with the semiconductor companies (the big ones in automotive being NXP, Infineon, Renesas, STM, and TI). In particular, the OEMs didn't really have relationships with the semiconductor companies, and they didn't really understand semiconductor design (nor even create their own software).

That is all changing. The OEMs realize that they need to be in control of at least some of their own silicon and a lot of their own software, otherwise their cars will be just like everyone else's. It reminds me of the transition in mobile where the market leaders, starting with Apple, all realized that they needed their own application processors at least. The tier-1s also needed deeper semiconductor relationships or they would be bypassed.

I talked about the demand-side (automotive) and supply-side (semiconductor) above, showing both sides need each other. But it goes deeper. The automotive companies have no experience of designing high-performance SoCs in advanced processes. But the semiconductor industry has no experience of trying to deliver automotive reliability in high-performance SoCs. It has been a learning exercise on both sides.

Big change like this always attracts new entrants. There is a lot of VC investment in new players, and a race to create working solutions. Meanwhile, as the diagram above showed, there is vertical integration with both OEMs and tier-1s seeking their own SoCs, while companies like NVIDIA and Intel/Mobileye provide standard solutions.

Sensors

A key part of autonomous driving is making the vehicle "see". There are three main technologies for this:

  • Vision: cameras of the type developed for cell-phones with (typically) CMOS image sensors.
  • Radar: this has the big advantage that it can "see" in the dark, in rain, fog, snow, and other conditions where vision is poor.
  • Lidar: this uses laser pulses to build up a picture of the vehicle's environment. Those are the big spinning things on the roof of first generation test vehicles, but for mass deployment they need to be solid state.

Sensor fusion sounds like a type of music, maybe next-generation techno, but in fact it refers to combining the information from all those sensors to get a single picture of what is going on around the vehicle, as a first step for deciding what to do next (keep driving, turn a corner, brake, and so on). The second part of the summit was entirely about sensors, and I will do a special post tomorrow with a lot more detail, so that will do for now.
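As a toy illustration of what "combining the information" can mean in practice, here is a minimal sketch (entirely hypothetical, not from the summit) that fuses noisy range estimates of one obstacle from camera, radar, and lidar by inverse-variance weighting, a common building block of fusion pipelines:

```python
# Toy sensor fusion: combine independent range estimates of a single
# obstacle using inverse-variance weighting. All numbers are made up
# for illustration; a real pipeline would use Kalman-style filtering.

def fuse(estimates):
    """estimates: list of (measured_range_m, variance) tuples, one per sensor."""
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    fused = sum(w * r for (r, _), w in zip(estimates, weights)) / total
    fused_var = 1.0 / total  # fused result is more certain than any single sensor
    return fused, fused_var

readings = [
    (41.8, 4.0),   # camera: range in metres, high variance in poor light
    (40.2, 1.0),   # radar: robust in rain, fog, and darkness
    (40.5, 0.25),  # lidar: most precise in clear conditions
]
r, v = fuse(readings)
print(f"fused range ~ {r:.1f} m, variance {v:.2f}")
# -> fused range ~ 40.5 m, variance 0.19
```

Note that the fused variance (0.19) is lower than the best individual sensor's (0.25), which is the whole point: each sensor covers for the others' weaknesses.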

Communication

Automotive vehicles require communication. At the very least, mapping and road condition data needs to be uploaded from the cloud to the vehicle. That requires a network connection, and it is seen as one of the drivers for 5G since it has high bandwidth and low latency. But vehicles need to operate even if the connection fails, or is congested: you can't go to the cloud and back to decide if a traffic light is red.

But there are potentially other forms of communication, known collectively as V-2-X. The X is usually another V, for V-2-V (vehicle to vehicle), where vehicles can signal to other cars what they are doing, or that they are approaching a blind intersection, or whatever. The other choice is I for infrastructure, giving V-2-I (vehicle to infrastructure), where things like traffic lights can communicate with vehicles to decide more intelligently when to change, or perhaps signal to vehicles what they will do soon.

I would say there are three modes of thought about communication (I leave it to you to draw analogies between how cars will work and how society is run):

  • Chinese: autonomous cars will be always connected, and a lot of coordination will be done in the cloud in a centralized manner. It helps that I have never once had no signal on my phone in China, even in subways and basement parking garages.
  • Self-contained: cars will contain all the sensors required to drive themselves, and the connectivity is only used for things like road condition updates that are not time-sensitive. Vehicles will not wait on the government to be a partner in getting to autonomy (except for legal and regulatory framework).
  • Self-contained and V-2-X: All the V-2-X stuff is driven by the traditional automotive industry and government bureaucracy, especially in Europe. But I have a feeling it might all just get overtaken by the first two. By the time traffic lights can communicate with vehicles (remember, there are about a million in the US alone) the vehicles will just read the light, and perhaps even do what I do and read the pedestrian crossing countdown too.

Levels

I think I mentioned driving levels above without explaining what they are. The Society of Automotive Engineers (SAE) has defined a five-level progression (six if you count level 0, which is no automation at all). They are:

  • Level 0: manual control
  • Level 1: driver assistance
  • Level 2: partial automation (feet off)
  • Level 3: conditional automation (hands off)
  • Level 4: high automation (eyes off)
  • Level 5: full automation (mind off)
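The progression above can be captured as a simple lookup table, e.g. for labeling a vehicle's capability in code (a sketch; the "feet/hands/eyes/mind off" tags are the informal ones used in the list):

```python
# SAE driving automation levels as listed above, as a simple lookup table.
SAE_LEVELS = {
    0: ("manual control", None),
    1: ("driver assistance", None),
    2: ("partial automation", "feet off"),
    3: ("conditional automation", "hands off"),
    4: ("high automation", "eyes off"),
    5: ("full automation", "mind off"),
}

def describe(level):
    """Return a human-readable description of an SAE automation level."""
    name, tag = SAE_LEVELS[level]
    return f"Level {level}: {name}" + (f" ({tag})" if tag else "")

print(describe(3))  # -> Level 3: conditional automation (hands off)
```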

Most cars today are level 1. Cars with things like lane-following and automatic emergency braking (AEB) are level 2. The first real "self-driving" comes in at level 3. Full automation, level 5, where you can imagine a car that no longer needs a steering wheel, is extremely demanding and likely a long way in the future. I noticed on a fully automated rail system, the London Docklands Light Railway, that there is still a full control dashboard underneath a normally-locked panel—and trains are simple, they can only go or stop. Cars will have manual controls for extreme conditions and breakdowns for a long time, I think.

Read More: Breakfast Bytes on Automotive

These posts are just the most recent (all 2018):

  • Overview: Automobil Elektronik Kongress 2018
  • Cadence's Automotive Solutions: CDNDrive Automotive Solutions: the Front Wheels and Rear Wheels
  • China: Trends, Technologies, and Regulations in China's Auto Market
  • Safety: CDNDrive: ISO 26262...Chapter 11 and The Safest Train Is One that Never Leaves the Station


Sign up for Sunday Brunch, the weekly Breakfast Bytes email.