How do you know how many coffee beans you have purchased (besides counting them out, that is)?
Coffee is generally measured by weight (not by volume). In the U.S., this may be in pounds, but for the rest of the world, it’s in kilograms. So how do we know that a kilogram is a kilogram?
The kilogram was originally defined in the late 1700s as the mass of a cubic decimeter (one liter, a cube one-tenth of a meter on a side) of water, and the meter was defined as a fraction of the distance between the North Pole and the Equator (1/10,000,000th of half the Earth’s meridian). In principle, any country that needed to create its own standard could do so from these definitions alone.
But (and this is old news, it happened in 1870), physicist James Clerk Maxwell pointed out that “the Earth might contract by cooling, or it might be enlarged by a layer of meteorites falling on it,” changing its shape and, with it, the length of the meter. Now, this might seem silly, but for precise measurements these are real concerns. Modern science depends on units being consistent across laboratories and over time.
Think about the basic things that you can measure using the metric system: length, mass, time, electric current, temperature, amount of substance, and luminous intensity.
The International Bureau of Weights and Measures (or BIPM, from its French initials) determines what these base units are by pegging them to the most constant definitions known. So, for example, the meter, which used to be measured against the inconstant Earth itself, was redefined in 1983 as the distance light travels in a vacuum during 1/299,792,458 of a second.
And the second? It is not, as you might think, hitched to the turning of the Earth; it is exactly “the duration of 9,192,631,770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the cesium-133 atom” (at a temperature of 0 K). Because the Earth’s rotation varies and is also slowing ever so slightly, a leap second is periodically added to clock time to keep clocks in sync with the Earth’s rotation.
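These two definitions chain together: the second comes from counting periods of the cesium radiation, and the meter then follows from the second and the fixed speed of light. A minimal sketch of that arithmetic (the constants are the exact defined values; the variable names are mine):

```python
# The two defining constants are exact by definition, not measured.
CS_FREQUENCY_HZ = 9_192_631_770        # Cs-133 hyperfine transition frequency
SPEED_OF_LIGHT_M_PER_S = 299_792_458   # speed of light in vacuum

# One second is 9,192,631,770 periods of the cesium radiation:
one_second = CS_FREQUENCY_HZ * (1 / CS_FREQUENCY_HZ)

# One meter is the distance light travels in 1/299,792,458 of a second:
one_meter = SPEED_OF_LIGHT_M_PER_S * (1 / SPEED_OF_LIGHT_M_PER_S)

print(one_second, one_meter)  # both come out to 1 (up to float rounding)
```

The point of the circular-looking math is that anyone with a cesium clock and a light source can realize these units anywhere, with no reference artifact required.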
Over the past few decades, six of the seven base units of the metric system — the meter, the second, the ampere, the kelvin, the mole, and the candela — have undergone similar transformations from artifact-based definitions to definitions based on fundamental constants.
Now it’s the kilogram’s turn for an update. Since its adoption by the BIPM in 1889, it has been measured against a cylinder of platinum-iridium alloy, known as “Le Grand K,” kept under two bell jars and under lock and key in Sèvres, France. But even the loss of a few atoms, or a fingerprint left on this lump of metal, can shift the standard. So what is the constant to which we can hitch this mass measurement?
I will be honest with you, I can’t fully explain it — certainly not within the confines of this blog. Let’s just say that researchers have built a Kibble balance (this is a great site and explains it better than I can), and, using this extremely sensitive scale together with the Planck constant, have come up with a new standard for the kilogram, which takes effect early next year.
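The core idea of the Kibble balance can at least be sketched: it balances mechanical power against electrical power, so that m·g·v = U·I, and the mass falls out of purely electrical measurements (which, via the Josephson and quantum Hall effects, trace back to the Planck constant). A toy calculation, with every numeric value below invented purely for illustration:

```python
# Kibble balance principle (illustrative only): in "moving" mode the
# mechanical power m*g*v equals the electrical power U*I, so
#   m = (U * I) / (g * v)
# All values here are made-up example numbers, not real measurements.
g = 9.80665              # local gravitational acceleration, m/s^2 (assumed)
v = 0.002                # coil velocity during the moving phase, m/s (assumed)
electrical_power = 0.0196133   # measured U*I in watts (assumed)

mass_kg = electrical_power / (g * v)
print(f"{mass_kg:.6f} kg")   # roughly 1 kg for these example numbers
```

Since the electrical quantities are realized from quantum effects whose scale is set by the Planck constant, fixing that constant fixes the kilogram — no metal cylinder needed.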
But this doesn’t really affect how many coffee beans are in a kilogram, right?
Maybe not, but we in the semiconductor world are dealing with impossibly small things. I can’t help but wonder whether the new definitions of weights and measures might affect how we measure the size of transistors or the heat they produce while they’re running. Of course, a hot spot is a hot spot no matter how you measure it. But if the standards by which we set limits — for example, on how hot a transistor can run — are themselves redefined, how do we know what is actually true? If the length of a second changes, the meter is refined, or the kilogram is adjusted to this new standard, how does that affect our work on things as fast, small, and lightweight as semiconductors?
I wonder if someone is looking into this.