The opening keynote from embedded world in Nuremberg was by Mark Papermaster, the CTO of AMD. The second-day keynote was by Andrea Martin, a CTO at IBM. The first thing to point out is that AMD and IBM are not the first companies that spring to mind when you think of embedded. Increasingly, embedded seems to mean anything that doesn't run in the cloud or a server farm, and AMD and IBM are both big in that other part of the business.
Indeed, Mark opened by saying he started in the mainframe business, then had periods in PCs and now mobile. Now data is with us at work and at play. Mark asked the same question I did: why is the CTO of AMD at embedded world? He said it is the start of another transition: embedded technology is going to weave seamlessly into our lives. For most people (not the sort of people in the audience), that is where their lives meet technology. Of course, Mark and everyone in the audience live technology every day.
That seam is becoming invisible. iPhone X recognizes you. Networks can now detect anomalies: not just known malware, but anything that doesn't look quite right. Amazon Go is a grocery store in Seattle where you just walk in, get your stuff, and walk out. It sees what you take, knows who you are, and knows how to charge you.
The megatrends in embedded are mostly exponential, and data is exploding: it will grow 50X by 2025, and AI and machine learning will affect everything. For example, GE is planning to spend $1B per year to analyze data. Walmart handles a million transactions per hour, generating 2PB of data per day. 100TB of data is uploaded to Facebook every day.
A big trend, significant for the embedded market, is the move of computation to the edge. This is being enabled by ease of application development using high-level languages, libraries, and frameworks—things like Caffe and TensorFlow. These "gaskets" hide the complexity of the underlying neural network, making the power of reasoning widely available and making it possible to bring stuff from the supercomputer labs to the masses. Think of smartphones. Mark (and I) had Palm Pilots and loved them. But it was a limited market (us geeks) with limited apps. Once iPhone and Android had toolchains, there was the meteoric rise of the smartphone industry that we have today. Mark thinks that this will happen for embedded in general.
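To make the "gasket" point concrete, here is a minimal pure-Python sketch of the kind of arithmetic a framework like Caffe or TensorFlow hides from the application developer: a two-layer neural-network forward pass. The weight values are arbitrary placeholders, not a trained model.

```python
import math
import random

def relu(v):
    # Elementwise rectified linear activation.
    return [max(0.0, x) for x in v]

def softmax(v):
    # Turn raw scores into probabilities that sum to 1.
    m = max(v)
    exps = [math.exp(x - m) for x in v]
    total = sum(exps)
    return [e / total for e in exps]

def dense(W, b, x):
    # One fully connected layer: W @ x + b.
    return [sum(w * xi for w, xi in zip(row, x)) + bi
            for row, bi in zip(W, b)]

random.seed(0)
# Placeholder weights standing in for a trained model.
W1 = [[random.gauss(0, 1) for _ in range(4)] for _ in range(8)]
b1 = [0.0] * 8
W2 = [[random.gauss(0, 1) for _ in range(8)] for _ in range(3)]
b2 = [0.0] * 3

def forward(x):
    # Hidden layer with ReLU, then a 3-way classifier head.
    return softmax(dense(W2, b2, relu(dense(W1, b1, x))))

probs = forward([0.5, -1.2, 3.3, 0.1])
```

A framework wraps layers like these behind a graph API, trains the weights, and maps the math onto whatever hardware (CPU, GPU, or an edge device) sits underneath—which is exactly the complexity the application developer no longer has to see.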
Which industries will drive it? His bets included medical and industrial applications, which he took in turn.
The reason this is a big change is that workloads that today can only run in a datacenter, with hundreds of watts of power, will move to the edge: either into a 1W IoT device, or into what Cisco calls "the fog", the devices between the personal ones and the datacenter itself. That means lower latency, with enough compute to do useful work.
Mark started with medical and had some numbers: 300M diagnostic radiology images are taken in the US every year, with 12M diagnostic errors. There are over 1M new cancer cases every year in the US. Improving this process has a huge impact: earlier detection, more accuracy, and more speed.
In the same way, there are huge gains in industrial areas: improved process automation, real-time inspection and quality control, and safer working conditions. Mark brought up Ricardo Ribalda to show how computer vision is done for the most prosaic of things: potatoes. In manufacturing, a lot of vision is used for deciding whether a part, like a screw, matches the prototype perfect screw. But there is no perfect model of a potato. You still have to screen out the ones with disease, the ones that are not potatoes (stones are more likely than the odd golf ball), and so on. Ricardo has been doing this for over a decade. It used to require a rack of equipment with tens of DSPs, which was super-expensive. In 2005, they moved to FPGAs, but doing development in VHDL from a standing start was hard. In 2009, they started to explore different markets and so built a more modular system, with the computation inside the body of the camera. They moved to a generic platform so that they could easily address markets other than their original "custom potato counter."
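Since there is no perfect potato to match against, screening is closer to outlier detection than to template matching. As a hypothetical illustration (not Ricardo's actual pipeline), here is a sketch that learns per-feature acceptance bounds from known-good potatoes and flags anything outside them; the feature names are made up and assumed to come from an upstream vision stage.

```python
import statistics

def fit_bounds(samples, k=3.0):
    # Learn per-feature acceptance bounds (mean +/- k*stdev)
    # from a batch of known-good items.
    bounds = {}
    for feat in samples[0]:
        vals = [s[feat] for s in samples]
        mu = statistics.mean(vals)
        sigma = statistics.stdev(vals)
        bounds[feat] = (mu - k * sigma, mu + k * sigma)
    return bounds

def is_anomaly(item, bounds):
    # Reject an item if any feature falls outside the learned bounds.
    return any(not (lo <= item[f] <= hi) for f, (lo, hi) in bounds.items())

# Hypothetical features measured from known-good potatoes.
good_potatoes = [
    {"roundness": 0.78, "brightness": 0.55},
    {"roundness": 0.81, "brightness": 0.52},
    {"roundness": 0.80, "brightness": 0.57},
    {"roundness": 0.76, "brightness": 0.54},
    {"roundness": 0.79, "brightness": 0.53},
]
bounds = fit_bounds(good_potatoes)

stone = {"roundness": 0.55, "brightness": 0.20}  # clearly out of range
flagged = is_anomaly(stone, bounds)
```

The point of the sketch is the asymmetry: you never model the stone or the diseased potato explicitly; you only model "normal" and reject everything that falls outside it.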
The result is that development no longer takes four years, just a few weeks. They still work a lot in the food industry, since that is where their clients are, but they are broadening into other industries.
Of course, AMD is a processor vendor, so it is no surprise that it is addressing this market with new chips. But this isn't your father's Oldsmobile; it's a mainframe on a chip, with up to 16 cores, 3GHz clock rates, and huge I/O bandwidth. The new era of edge computing will require new silicon platforms. Mark emphasized that he thinks the future is open: Linux-dominated, open-source tools...and the AMD machine learning stack is all open, too.
Sign up for Sunday Brunch, the weekly Breakfast Bytes email.