In the semiconductor industry, we've been dealing with the exponential growth associated with Moore's Law for over 50 years. Even so, I don't think that gives us an intuitive understanding of exponential growth; nobody really seems to have a feel for it. Of course, we can run the calculations and work out how many transistors will be on a die in 2030 if that number continues to double roughly every two years.

This has been brought home to me recently. The big thing people don't get is that the early part of an exponential is not scary at all, even though it is still an exponential. When multi-core processors first appeared, there were two or maybe four cores on a processor. In a keynote at EDPS back then, I tried to coin the phrase "Core's Law" for the way the number of cores on a processor was doubling every couple of years. It just wasn't that obvious at the time, since we were still on the flat part of the curve. Now that processors have 32 and 64 cores, it is a lot clearer.

A quick question to tell if anyone "gets it" is as follows:

A fast-growing weed doubles in size every 24 hours. A big pond is fully covered with weeds after 30 days. How many days did it take to be half-covered?

"Normal" people say things like 15 days, or get daring and go for 20 or even 25. The answer, if you are used to exponentials like Moore's Law, is clearly 29. The pond is half-covered after 29 days, then the weed doubles in size again and the pond is fully covered. But after 10 days, the pond would only have been one-millionth covered. To make that clearer still, if the pond is circular, 100 feet in diameter (that's a big pond), after 10 days the weed that is going to clog the pond completely 20 days later would have covered approximately *one square inch*.
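The pond puzzle is easy to check by hand, but a few lines of code make the numbers vivid. This is a sketch that simply evaluates the coverage fraction 2^(day − 30), using the 100-foot pond from the text:

```python
import math

FULL_DAY = 30  # the day the pond is fully covered

def coverage(day):
    """Fraction of the pond covered on a given day (doubles every 24 hours)."""
    return 2 ** (day - FULL_DAY)

assert coverage(29) == 0.5   # half-covered just one day before the end
print(coverage(10))          # ~9.5e-7: about one-millionth covered on day 10

# For a circular pond 100 feet in diameter, the day-10 weed patch in square inches:
pond_area_sq_in = math.pi * (50 * 12) ** 2   # radius 50 ft = 600 inches
print(pond_area_sq_in * coverage(10))        # ≈ 1.08 square inches
```

The pond itself is over a million square inches, which is exactly why a one-square-inch weed patch on day 10 looks like nothing at all.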

# Linear Moore's Law

Moore's original paper showed this graph. Perhaps the most amazing thing about Moore's Law is that it was extrapolated from just five datapoints (or four, if you don't count the single transistor in 1959). The last datapoint is 2^6 = 64 transistors in 1965, which took a year to double from 32 in 1964. Later, Gordon Moore revised the doubling time to two years. An exponential like this can be looked at in two ways. One is the doubling time, which is what Moore's Law predicts. The other is the absolute numbers. From an epidemiologist's point of view, the doubling time of a virus might be important, but a country cares about how many people are dying, or perhaps how often someone dies.

Let's take a look at Moore's Law from that perspective. You can pick your own numbers, but I'm going to go with some GPUs and high-end CPUs. My two datapoints are that in 2019 a GPU/CPU had 24B transistors, and that in 2017, two years earlier, it had only half that, 12B transistors. I picked these numbers partly so that the math is easy to do mentally. There are about 63M seconds in two years, so the number of transistors on a die increases by about 200 per second. What took a year in 1965, adding 32 transistors, takes less than 200ms today. Of course, the real world doesn't actually work like that: new processes come along a node at a time, not smoothly. But the basic idea is correct: what took a year in the mid-1960s takes a fraction of a second today.
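The arithmetic above can be checked in a few lines. This is just a back-of-the-envelope sketch using the two datapoints from the text (12B transistors in 2017, 24B in 2019):

```python
# Seconds in two years (ignoring leap days; close enough for a ballpark figure)
SECONDS_PER_TWO_YEARS = 2 * 365 * 24 * 3600   # ≈ 63M seconds

added = 24e9 - 12e9                  # transistors added over those two years
rate = added / SECONDS_PER_TWO_YEARS
print(rate)                          # ≈ 190 transistors per second: "about 200"

# Time to add the 32 transistors that took all of 1965:
print(32 / rate)                     # ≈ 0.17 seconds, i.e. under 200ms
```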

We can do some similar calculations about design productivity. If those 24B transistors were designed in two years, then the design team is designing one billion transistors per month, or 33M per day, or around 400 per second. These are ballpark numbers and ignore things like memory compilers versus complicated logic. But to put it all in perspective, I can remember when a 10K gate ASIC was the biggest design we could do at VLSI Technology in the early 1980s. That's 40K transistors or so. At today's productivity level, that's less than two minutes to design. Back then it probably took more than six months.
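The same ballpark productivity figures can be sketched in code. The four-transistors-per-gate conversion for the 10K-gate ASIC is the one from the text:

```python
# Design productivity: 24B transistors designed over two years
transistors = 24e9
per_month = transistors / 24          # one billion per month
per_day = per_month / 30              # ~33M per day
per_second = per_day / 86400          # ~385 per second, "around 400"

# A 10K-gate ASIC from the early 1980s, at ~4 transistors per gate:
asic_transistors = 10_000 * 4
print(asic_transistors / per_second)  # ≈ 104 seconds: "less than two minutes"
```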

Also, if you are not used to seeing exponentials plotted on logarithmic plots, those straight lines look reassuringly unthreatening. You pretty much have to plot exponentials on a logarithmic plot since you can't really fit anything else into a graphic on a page. The Computer History Museum has a graph that shows what Moore's Law is really like. Flat...flat...flat...vertical. The Museum can't even fit the plot in the building!

# Rule of 70

Another set of people who understand exponentials, although typically much lower-growth-rate exponentials, are people in finance, who deal with compound interest and CAGR (Compound Annual Growth Rate, usually pronounced "cagger"). Compound interest even impressed Albert Einstein, who reportedly said:

> Compound interest is the eighth wonder of the world. He who understands it, earns it; he who doesn't, pays it.

One thing that I'm always surprised more people don't know is the "rule of 70". If an investment, market, or economy is growing at X% per year, then it will double in size in 70/X years. So if the market for automotive semiconductors is growing at 10% per year, then it will double in size in 70/10 = 7 years.

Alternatively, if you know the doubling time in years (say D), then 70/D is the annual percentage growth (or interest) rate.

You don't need to read the rest of this paragraph if you're not interested in why this works. ln(2) is 0.693, which is close enough to 0.7 for this purpose. We get from 0.7 to 70 because we use a percentage interest rate (5%) rather than a fraction (0.05). The exact doubling time is ln(2) / ln(1 + X/100), which is close to 70/X. The 2 comes from the fact that we are interested in doubling; if we were interested in how long it took to get 10 times as large, we'd use ln(10). Sometimes the rule is stated as 72/X or 69/X. The actual doubling period also depends on whether growth is continuous or interest is compounded (say) monthly.
