I sat down with Simon Segars, the CEO of ARM, last Friday. As I said yesterday, this week is ARM's 25th birthday (on Friday, if you want the precise date). Although today, of course, we think of even the largest ARM processors as something to embed in an SoC, when the first ARMs were created they were standalone processors taking up the whole die. The ARM1 and ARM2 were just processors. With the ARM3, there was room for a cache. By the time ARM was spun out of Acorn, they were working on the ARM6, a processor and cache intended for the Apple Newton. Unfortunately, the Newton was way ahead of its time and was not a commercial success.
The turning point for ARM was the ARM7. Actually, the ARM7TDMI. What do all those letters mean? D was for debug, allowing JTAG-based debugging. M was for multiplier: it had a fast hardware multiplier. I was for ICEBreaker, a sort of on-chip in-circuit emulator that allowed hardware breakpoints and watchpoints. But the most important letter was the T, which stood for Thumb.
All ARM processors were 32-bit, but that meant all instruction fetches were 32 bits wide and so every instruction took up 32 bits. This meant that the code size was large. A joint project was kicked off between Nokia, Texas Instruments, and ARM to address this. Nokia had decided that it wanted to use a 32-bit processor in future phones, and TI was their semiconductor supplier. This was an aggressive decision at the time, since most cellphones used 8-bit processors. For example, Nokia's big Scandinavian competitor Ericsson used the Z80. But Nokia reckoned that by moving more of the logic into the processor they would improve overall system efficiency. The code density, however, was a big issue.
This motivated the creation of Thumb, a mode in which the processor could execute a limited set of 16-bit instructions. The code density was much higher, so the system required less memory and was smaller. This was despite the fact that the ARM7TDMI processor itself was bigger than a pure ARM7, since it required the Thumb instruction decode in addition to the normal instruction decode.
The lead designer for the ARM7TDMI, and its eventual project manager, was...Simon Segars. He wanted to show me the original ARM7TDMI testchip, which normally lives in his office, but it is currently on loan to a museum in Cambridge for an exhibition marking 25 years of ARM. The project had about 10 people on the hardware design, and another group working on software. The Thumb instruction set required assembler, compiler, and debugger support, so hardware and software were being developed at the same time. The ARM7TDMI came out in 1995 (for some reason Wikipedia says 1998) and a synthesizable version (per the ARM website) in 1998 (Wikipedia says 2001). My memory is that ARM's history is correct and Wikipedia is wrong (the internet is wrong!).
Simon thinks that over 30 billion ARM7TDMI chips have shipped, making it the biggest selling microprocessor of all time, at least in terms of unit volume. It was also the turning point for ARM as a company. The ARM7TDMI became the "standard" microprocessor to put in cellphones, not just at Nokia but everywhere else. Even Ericsson switched from the Z80 to ARM. VLSI Technology's GSM chipsets were ARM7TDMI from the beginning. The cellphone industry entered its period of high growth, and ARM found themselves sitting on top of the rocket.
But it was not just cellphones. The semiconductor companies that licensed the ARM7TDMI and then were not successful in mobile found ways to use it in other industries. Increasingly, the ARM7TDMI was the standard microprocessor for any semiconductor company that didn't already have its own internal microprocessor design team, which was most of them. Then, gradually, ARM became the standard microprocessor for almost everything except Intel-architecture PCs and servers.
Of course, ARM has its eye on those servers, too. Simon told me that the end markets clearly want an alternative, and one that will be around for a long time. ARM has been investing in making building servers both easier and more standard, in particular with the Server Base System Architecture (SBSA). He also told ARM's CIO to get arm.com running on ARM servers and find out what management software and other infrastructure is missing. Lots of semiconductor companies, such as Cavium and Qualcomm, have ARM-based server products available, and there is a lot of evaluation going on.

One area where ARM actually has a potential advantage is China, which increasingly wants to favor local suppliers. Did you know that the value of China's semiconductor imports is bigger than the value of its oil imports? Yes, a lot of those semiconductors get re-exported, unlike the oil, but it is still an extraordinary statistic. The growth in servers is not just driven by companies like Google and Facebook but by mobile and the internet of things (IoT), which have requirements for devices, cloud back ends, and new network architectures with less latency. All of us are going to generate a lot more data than we do today, and it will all need to be moved around and processed.
ARM started with 12 people. It is now about 4000 people. Although it clearly has a British heritage and is in some sense a British company, only 1500 of those 4000 work in the UK (Simon is not one of them, he lives in the US, although I think he pretty much lives on planes). The rest are all over the world. It has been quite an eventful 25 years.