Recently I ran into Lip-Bu Tan, our CEO, after one of his round-table talks, and asked him what he’s been reading lately. He said (and I paraphrase because I didn’t have my notebook with me to write it down verbatim), “I have been reading about quantum computing. All about quantum computing.”

So I thought I had better develop a deeper understanding than the cursory one I had when writing about Lip-Bu’s The Top Five Disruptors to our EDA world, which he talked about at CDNLive Silicon Valley last April.

# Light

Remember a few months ago when I was talking about quantum physics (in How Do You Change What Happened in the Past?)? You might want to review it really quickly, if only to remind yourself how light can be both particles and waves, and how, when you look at light as a wave, it is waves of probability that you are seeing. Once you observe where a light particle lands, the wave pattern collapses and you are left with only one outcome.

Okay, got it?

# Beyond Binary: Superposition and Entanglement

Now think about coin-tossing. While the coin is spinning in the air, you can say that the coin is both heads *and* tails. Before the coin lands, the two outcomes genuinely coexist in **superposition**. (When the coin lands, you collapse the probability of one result to 0% and the probability of the other result to 100%.) This spinning coin, or “qubit”, can exist in a state of 1, 0, or both 1 *and* 0.

Qubits can be in the “|0⟩” state (called a zero-ket), the “|1⟩” state (called the one-ket), or a linear combination of the two (superposition). The half-angle bracket notation |⟩ is conventionally used to indicate qubits, as opposed to ordinary bits.
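To make the ket notation concrete, here is a tiny Python sketch (my own illustration, not taken from the quoted source) of a qubit as a pair of amplitudes:

```python
import math

# A qubit is a pair of "amplitudes" (complex numbers in general).
ket0 = (1.0, 0.0)          # |0>: definitely zero
ket1 = (0.0, 1.0)          # |1>: definitely one

# The "spinning coin": an equal superposition of |0> and |1>.
s = 1 / math.sqrt(2)
coin = (s * ket0[0] + s * ket1[0], s * ket0[1] + s * ket1[1])

# The probability of each outcome is the squared magnitude of its amplitude.
probs = [a * a for a in coin]
print(probs)  # close to [0.5, 0.5]: a fair coin, until measurement collapses it
```

Measuring the qubit forces it into one of the two kets, with those probabilities.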

Now let’s imagine an experiment with ping-pong balls and boxes. (This is nicer than thinking about killing cats, as Schrödinger did in his famous thought experiment.)

Imagine you have a contraption that, when you push a button, will randomly deposit a ball into one of two boxes. You only have one ball, and you can’t see which box the ball will go into. After pushing the button and before you open the box, three things are true:

- The boxes are in a state of superposition—that is, each box both has and doesn’t have the ball in it
- There is some probability that either box holds the ball
- When you open one box and find there *is* a ball in it, you know that the other box does *not* have a ball

This last thing demonstrates **entanglement**. Those boxes are quantumly entangled, no matter where in the universe they exist. By observing the contents of either box, you instantly know what happened in the other box, no matter how far apart they are (though, notably, this correlation can’t be used to send a message faster than light).

In classic computing, we could say each of those boxes contains information—a zero or a one, a single bit of data—but we have to open the box to get that information. In quantum computing, before we open the box, we have a **qubit** instead, or quantum bit—that is, a zero, a one, or both—and we don’t even have to open the box. (Eight qubits form a **qubyte**.)
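Here is how those entangled boxes look as amplitudes, in a hand-rolled Python sketch (my own illustration), using the two-qubit state physicists call a Bell state:

```python
import math

# The two boxes as a two-qubit state. The joint state has four
# amplitudes, one per outcome, ordered |00>, |01>, |10>, |11>
# (first digit = box A, second digit = box B; 1 = "has the ball").
s = 1 / math.sqrt(2)
bell = [0.0, s, s, 0.0]   # exactly one box has the ball, never both or neither

probs = [a * a for a in bell]
print(probs)  # only |01> and |10> have nonzero probability

# Opening box A and seeing "no ball" (a 0) leaves only the |01>
# outcome: box B must contain the ball. That correlation is entanglement.
```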

*This is a Bloch Sphere, used to represent a qubit.*

Now, let’s think about classic computing. Let me remind you that current computing is a binary system—a hugely complicated system of 1’s and 0’s, “on”s and “off”s, derived from logic gates built out of transistors (currently manufactured at process nodes of around 14nm). A “bit” is the 1 or 0, and a classic byte has 8 bits. With 2^8, or 256, possible permutations of zeros and ones, we can create every character, number, or symbol that we need to perform all the individual tasks that we require from a computer.

But a qubyte could, in theory, store all 256 of those possibilities at once; all combinations of zeros and ones exist in one qubyte. And this scales exponentially—if you have, say, twenty qubits, that is 2^20 permutations of 0 and 1, so you can hold over a million values in parallel.
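The counting above can be sanity-checked in a couple of lines of Python (purely illustrative):

```python
# A classical byte holds ONE of 2**8 patterns at a time; a qubyte's
# state assigns an amplitude to ALL of them simultaneously.
n_bits = 8
print(2 ** n_bits)    # 256 patterns for a classic byte

# Twenty qubits span 2**20 patterns: over a million values in parallel.
n_qubits = 20
print(2 ** n_qubits)  # 1048576
```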

*This shows the Bloch Sphere with a bunch of possibilities.*

# Gates

In classical computing, a normal **logic gate** (like **AND**, **OR**, and **NOT**) takes a simple set of inputs and produces one definite output. The answer is certain: the way the gates are wired up means a given input always produces the same answer.

A **quantum gate** takes superpositions as input, rotates their probabilities, and produces another superposition as its output. From this output, certain algorithms can amplify the probability of the correct answer, collapsing the superposition to an actual sequence of zeros and ones. What this means is that you get the entire set of calculations that are possible with your setup, all done at the same time.

Ultimately, you can only measure one of the results, and it will only *probably* be the one you want, so you may have to double-check and try again. But by cleverly exploiting superposition and entanglement, this can be exponentially more efficient than would ever be possible on a normal computer that has to perform all of the calculations in sequence to arrive at the final answer.
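To see what “rotating probabilities” means, here is a small hand-rolled sketch (my own illustration, not from any quantum library) of the Hadamard gate, a standard single-qubit gate:

```python
import math

# A quantum gate is a matrix that rotates amplitudes. The Hadamard
# gate H turns a definite |0> into an equal superposition.
s = 1 / math.sqrt(2)
H = [[s, s],
     [s, -s]]

def apply(gate, state):
    """Multiply a 2x2 gate into a 2-amplitude state."""
    return [gate[0][0] * state[0] + gate[0][1] * state[1],
            gate[1][0] * state[0] + gate[1][1] * state[1]]

ket0 = [1.0, 0.0]
after = apply(H, ket0)            # amplitudes of about [0.707, 0.707]
probs = [a * a for a in after]    # about [0.5, 0.5]

# Applying H twice undoes it: gates are reversible, unlike
# measurement, which collapses the state for good.
back = apply(H, after)            # back to about [1.0, 0.0]
```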

So, think of solving a maze. A classic computer tests every possible path one at a time, eventually arriving at the solution. A quantum computer considers the maze in its entirety and points to the most likely solution. Even if you have to run the calculation a few times to confirm the right answer, this method is vastly more efficient than testing the paths in sequence.

*On the left, a solution solved by traditional computing; on the right, the same maze solved using quantum computing.*

# Why Is Lip-Bu So Excited by Quantum Computing?

While current computer systems are incredibly powerful, performing an unthinkable number of tasks in sequence, there are limits to what they can do. Problems whose variables grow exponentially are ones that current computers just can’t handle with logic gates and a purely binary way of representing data.

The most common practical uses for quantum computing I have read about are security algorithms, efficiency applications, AI applications, and, what I think is most important to Cadence, simulation applications.

Any system that has variables can be simulated to determine all the possibilities that can happen within it. And as any complex-system engineer knows, changing even one logic gate adds yet another exponent of complexity. When designing any system—chip, board, or full system—before building the thing, designers need to simulate, emulate, and verify that their design is the most efficient, the most powerful, and uses as little power as possible.

Right now, to perform some of these simulations, companies have to use as many as hundreds of servers to verify their designs—and offering verification in the cloud brings even more servers into the mix. Yet even with the biggest supercomputers in the world thrown in, no classic computer can perform all of those calculations.

*See what I did there? Verification in the cloud? I amuse myself.*

But theoretically, with quantum computers, all of those calculations could be done at once, with a tiny fraction of the computing power. Verification, simulation, and emulation wouldn’t have to be so difficult.

Therein lies his excitement—at least, as far as I can tell. (I would have to ask Lip-Bu directly to confirm.) It could be that quantum computing will be required for blockchain technology. It could be that quantum computing is ideal for neuromorphic computing. It could be that quantum computing can be used for machine learning, deep learning, and AI applications, which have become hugely important to Cadence lately, both for designing chips and for all of the machine learning applications that use chips.

No matter. It’s important. Quantum computing may be the bump over the edge for Moore’s Law, which seems to be coming to its end. And because I trust that Lip-Bu has his finger on the pulse of the future of computing, I’ll continue to keep an eye on quantum computing developments.

*—Meera*

P.S. I had to do so much research for this blog post. If what I have written doesn’t quite make sense, or if you have further questions, check out the following links that helped me the most:

- If you have 20 minutes to spare, this was very useful to me: Dr. Talia Gershon’s explanation of quantum computing (www.youtube.com/embed/wgCuKTN8sX0), an 18-minute video of a presentation at a Maker Faire, kind of like a TED talk. She seems to be one of the biggest quantum computing evangelists out there, and her talk is fascinating. She’s at IBM.
- IBM’s Beginner’s Guide (www.youtube.com/embed/S52rxZG-zi0), which Dr. Gershon talks about in her presentation; it’s a nice explanation, going into a bit more detail than I do here. IBM seems to be the company most on the cutting edge of quantum computing, even allowing the public to play with their five-qubit computer.