Paul McLellan

50 Years of the Microprocessor, Part 1

13 Jul 2021 • 9 minute read

At the recent International Symposium on Computer Architecture (ISCA), there was a special session to celebrate the 50th anniversary of the microprocessor, with eight experts who had influence during those decades. They were:

  • Federico Faggin, designer of the first microprocessor, the 4004, and the founder of Zilog.
  • John Hennessy, founder of MIPS, won the Turing award for work on RISC (with Patterson), currently chairman of Alphabet.
  • Dave Patterson, Berkeley RISC, won the Turing award for work on RISC, currently at Google.
  • Glenn Henry, IBM veteran, from IBM mainframes to PCs, and now Centaur Technology.
  • Kathy Papermaster, IBM designer for 26 years, head of the IBM Cell processor used in the PlayStation 3.
  • Lee Smith, a founder of Arm, joined forerunner Acorn in 1983.
  • Shekhar Borkar, microprocessor research for 34 years at Intel, now at Qualcomm.
  • Chris Rowen, co-founder of MIPS, founder of Tensilica, founder of BabbleLabs, currently at Cisco.
  • Scott Gardner (the moderator), at IDT for 10 years, then Intrinsity (acquired by Apple).

I am going to divide this up into three posts: two on what the panelists said during the event, and one which is a combination of their conclusions and my own.

Lizy John, the ISCA program chair, welcomed everyone and then passed the baton to Scott to moderate the next ninety minutes.

Scott started with an ad for the Intel 4004, the first microprocessor, from November 1971. More of that history is coming up below.

Scott also had some memorabilia from the Microprocessor Forum marking the 25th anniversary of the microprocessor. The display case contained actual chips. You can see the statistics for the 4004:

  • 14.7 mm²
  • 2,300 transistors
  • 10 µm PMOS
  • 1-layer metal
  • Released (in the general-availability version) in November 1971, although the proprietary version shipped earlier in the year.

Scott also talked a bit about how IDT was involved in manufacturing the first MIPS processors. Here's a trivia fact of the day: Cadence's building 11 (across the road) used to be an IDT fab. Twenty years ago, when I ran Custom IC, we were all in that building. When Cadence acquired the fab, apparently we had to put out a press release clarifying that we were just acquiring it for office space and that we were not entering the foundry business.

Scott started by asking each panelist to spend a couple of minutes introducing themselves and also talking about inflection points, whether good or bad, over the last 50 years. In fact, he had a more detailed agenda.

Before asking Federico to kick things off, Scott gave a plug for Federico's recently published autobiography, Silicon.

Having just written his autobiography, Federico was on a roll when he was asked to give his story to open the discussion.

Federico Faggin

The story starts in 1968 at Fairchild. All ICs used bipolar in that era. There was an up-and-coming technology called MOS, but it still had many problems. You could put a lot more transistors on a single chip, but it was extremely slow and unreliable. At the end of 1968, we had developed the Fairchild 3708. It used polycrystalline self-aligned gates, with transistors that were five times faster than metal gate, and you could put many more transistors on the same chip since you could connect the gate directly to the junction without metal. Finally you could put an entire CPU on a single chip. In those days, the driving characteristics were very poor, so you couldn’t build multi-chip systems; it would just not be that useful. Microprocessors are only useful if they are fast enough and you can create lots of different products with a single device.

Then at Intel they had silicon gate technology and a customer, Busicom, who had developed an architecture with three chips to make a CPU. It used shift-register memories, which were all that was available. Clearly, you need RAM for a computer. Intel was developing DRAM, which is possible with silicon gate and its very low leakage current. We improved the architecture and created a four-chip solution: ROM, RAM, I/O, CPU. The CPU was way beyond what anyone had designed before, and I was hired by Intel to figure out how to do it. I had all the pieces, and that chip worked in March 1971 and was sold to Busicom that month. I pushed Intel management to buy out the exclusivity contract with Busicom and actually sell that chip. That was the ad you saw earlier. November 1971. Quickly, I also led the design of the 8008, the first 8-bit microprocessor, and then the 8080 (six times faster than the 8008), which really kicked the microprocessor into gear. Intel was still a memory company that only sold microprocessors to sell memories.

That was unacceptable to me, so I left and founded Zilog as a pure microprocessor company. The Z80 was a very powerful device, the first to use pipelining in the 4-bit ALU to do 16-bit operations. But that was a CISC architecture. That, along with the 8080 and the 6502, really started the microprocessor market and created a business that drove microprocessors for the next 40 or 50 years.

John Hennessy

I arrived at Stanford, and the first course I taught was a microprocessor lab course using 8080s and later Zilog Z80s. My background was in compilers, and I was doing compiler research. I realized we could compile down to a much simpler instruction set and not need microcode. And that was how we came to the RISC ideas. We could see that Moore’s law would allow us to do stuff like virtual memory and caches. But nobody foresaw how the microprocessor would take over the whole industry. It happened so fast, basically by 1990. It was a tribute to the progress of semiconductor technology.

Dave Patterson

I worked on microprogramming tools, and RISC is an ISA that doesn't need microcode. There's no need for an interpreter in the hardware. We believed in Moore’s law and so believed that everything in a mainframe would eventually be in a microprocessor. Minicomputers had just gone from 16-bit to 32-bit. We realized it made no sense to build a 32-bit microcode engine. People forget that at the time it was extraordinarily controversial. People thought we would damage the computer industry by not building a high-level instruction set with microcode. There were a lot of people who thought we had dangerous ideas and tried to suppress them.

Chris Rowen

I worked with Chris for years since Tensilica was a SemiWiki client and the account was assigned to me. Then Cadence acquired Tensilica, so Chris was here when I rejoined Cadence in 2015. Even after he left, he still consulted for Cadence, so he was still around enough for me to use him as a resource for Breakfast Bytes.

I came originally from a physics background, went to Intel in 1977, and gradually moved to microprocessors and applications. The most important inflection point in the microprocessor was probably not the ISA but the great divide, starting about 25 years ago, between the mainstream, which is mostly about raising the level of abstraction: higher-level languages, libraries, operating systems, cloud-based services. Really, it is about how to make it really easy to develop applications, and efficiency is second order. The other side is domain-specific, led by graphics, AI, and real-time media processing. There you care a lot about how efficiently you use energy and silicon real estate. Standard architectures are about high development efficiency, and domain-specific architectures are about executing efficiently.

The way you do addition in the real world is “Hey Siri, what is 23 + 46?” and if you work out how many cycles that takes, it is bazillions. My career has followed that, from working in silicon, to RISC, to domain-specific stuff and Tensilica. There the ISA doesn’t matter that much, and several thousand ISA supersets of the basic thing were created. Whereas the mainstream is dominated by x86 and Arm.

Kathy Papermaster

I joined IBM in 1983 in Burlington as a circuit designer. Most of the bipolar work was being done in East Fishkill, which was a cash cow for IBM. Burlington had just switched from NMOS to CMOS. We decided it was too cold and came to Austin, and started the Motorola, Apple, and IBM design center named Somerset. That was the 1990s. I loved the team. After Somerset, the PowerPC work concluded and the PlayStation 3 launched. I took trips to Japan to talk to Sony, Toshiba was involved, and finally I was the director of Cell architecture. I've been retired for the last 10 years.

Glenn Henry

I’ve been doing this a long time. I started as an application programmer in 1963 and joined IBM in 1967. IBM was already shipping computers with virtual memory, virtual machine emulation, caches, multiprocessing, out-of-order execution, and a common API. It is the silicon technology that is the differentiator for a microprocessor. The architecture has evolved naturally, following mainframes. One project I want to mention is the System/38, which had an object-oriented instruction set, high-level machine stuff, a single-level 48-bit store, and a built-in relational database. This was the 1970s. If I look at the long-term effects, they are twofold. First, silicon gate technology allowed size and power to be reduced. The other implication of what Federico did was the 8080, which drove the microprocessor. And there is a direct line between that and what we do at Centaur. We have been doing x86 processors for 26 years (IBM, Dell, HP, Lenovo, and Samsung are customers, so it is a real company).

Lee Smith

I was at Edinburgh with Lee. In fact, when he mentions working on CAD, that was a design system that he and I created, based around a language I came up with called ESDL (Edinburgh Structural Description Language). It basically worked at the gate-level netlist, doing board-level design. We hadn't yet become Mead & Conway babies and started learning how chips were designed.

I think RISC and CISC converged over the years, and I’m not sure it is a real war anymore. I started in high school, writing my first programs for the IBM 360 in Fortran. I first came across microprocessors at Edinburgh, where we were looking to build a “3M” machine: 1 MIPS, 1 megabyte, 1 megapixel. We surveyed the market and found most of the chipsets of the era wanting in one way or another. Eventually, we built a machine based on the 68000, and I ported the portable C compiler to it via VAX/VMS. At the end, I was pretty impressed that this thing could hold its own compiling C code versus the department's shared VAX. That was how I came to believe that microprocessors were going to become real. I moved to Acorn in 1983 and they presented me with a BBC machine with a 6502 inside. I couldn’t take this thing seriously. My history was that I did computer-aided design tools and then I did compilers. The reality is that 16 bits is just not enough code space for any of those things. You can use overlays; I’ve used overlay managers, I’ve even implemented one. It’s completely miserable. For me, the real world of computing on microprocessors only began when we got 32-bit micros. I was at Acorn when Acorn produced the Arm 1. So I was seven years at Acorn and then 30-something at Arm.

Shekhar Borkar

My history is not as rich. I'm a physicist and was in grad school in the '70s and '80s. Then I joined Intel to work on the 8051. Today that is a basic microcontroller, but in those days the 8051 was "powerful". I then switched to high-performance computing, followed by microprocessor research. I retired from Intel and I am now at Qualcomm.

Part 2

Shekhar had slides that he presented, too. But since those really kicked off the discussion part of the panel, I'll leave them to the second post, as this one is getting very long.

UPDATE: here's a link for part 2.

 

Sign up for Sunday Brunch, the weekly Breakfast Bytes email.