Quantum Computing

 [Image: a spinning coin - the curvature is a CMOS rolling-shutter artifact]
One of the biggest conceptual and architectural changes in computing is quantum computing - using the nature of quantum mechanics to change the way calculations are done. Modern computers already rely on quantum mechanics through semiconductors, but mainly to build logic gates that mirror those once made with vacuum tubes and relays. In a quantum computer, the logic directly relies on two key features of quantum mechanical probability: superposition and entanglement of states. Quantum computers are believed to be able to quickly solve certain problems that the fastest modern computers struggle with - so-called quantum supremacy (of quantum computers over classical computers for a type of problem). Theoretical work shows that widely used asymmetric key cryptography could be easily broken, meaning data secured with it, like credit card or password information, would become insecure. There are limits, though: quantum computers won't solve problems that normal computers cannot solve at all. Richard Feynman, the noted physicist, hinted at the idea as early as 1959, and in the 1980s Paul Benioff and Yuri Manin developed the foundational ideas for these computers.

Some Background 

Probability is the likelihood of something happening. Flip a coin and the probability of it landing heads up is 1/2; the same goes for tails. Added together, the probability of heads or tails is 1 (assuming the coin won't land and remain stationary on its edge). Probabilities are values between 0 and 1: 0 means no chance of an event occurring, 1 means it will certainly occur, and usually the value is somewhere in between. With quantum mechanics, this is more complicated - probability in these systems comes from squaring the probability amplitude. Amplitudes can be positive or negative, and combining positive and negative amplitudes creates regions of stronger or weaker amplitude (it gets more interesting still, because the amplitudes are complex numbers). Square those combined amplitudes to get probabilities. This is how atoms bond together into molecules, liquids or solids - the electrons (quantum mechanical wave-particles) of an atom have amplitudes that combine with the electrons of other atoms to create bonding and anti-bonding combinations. In mathematical terms, classical probability looks like:
Sum(p_i) = 1
where each p_i >= 0.
In quantum mechanical terms, probability is like this:
Sum(|a_i|^2) = 1
where each a_i is a complex number, so amplitudes can be negative and cancel each other.
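The two normalization rules above can be sketched in a few lines of code. This is only an illustration of the arithmetic, not a quantum simulation:

```python
# Sketch (illustrative only): classical probabilities vs quantum amplitudes.
from math import sqrt

# Classical: probabilities are non-negative and sum to 1.
p = [0.5, 0.5]
assert all(x >= 0 for x in p) and abs(sum(p) - 1) < 1e-9

# Quantum: amplitudes can be negative (or complex); probabilities are
# |a_i|^2 and still sum to 1.
a = [1 / sqrt(2), -1 / sqrt(2)]
assert abs(sum(abs(x) ** 2 for x in a) - 1) < 1e-9

# Interference: amplitudes of opposite sign cancel (destructive), equal
# signs reinforce (constructive) - impossible with bare probabilities.
combined = a[0] + a[1]
print(abs(combined) ** 2)  # 0.0 - that outcome is suppressed entirely
```

The last line is the key difference: with ordinary probabilities you can only ever add likelihood, but amplitudes can cancel, which is exactly the bonding/anti-bonding behavior described above.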

Superposition is an important concept in understanding quantum mechanical systems. It's the concept behind Schrodinger's Cat if you've heard of that. A quantum mechanical system can be composed of multiple states all at the same time. It's like a spinning coin - heads, tails, heads, tails. Only when you measure the system/state or check the coin do you get a definite answer. Measurement causes a collapse of the superposition of states into one state. Before measuring, the 'spinning' or unchecked state of the coin could be seen like this:
|Total State> = (|head> + |tail>)/sqrt(2)
The probability of each outcome is the square of its amplitude (the cross head-tail terms drop out):
1/2 Heads + 1/2 Tails = 1 (as the total probability needs to be 1).
While the measured state would be, if the coin is heads up:
|Total State> = |head>
with probability being 1 and Heads = 1.
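The spinning-coin picture can be mimicked with a tiny sampler. This is a classical simulation of the measurement statistics only, not of the quantum state itself:

```python
# Sketch: simulating measurement of the 'spinning coin' qubit.
# The state (|head> + |tail>)/sqrt(2) gives each outcome probability 1/2.
import random
from math import sqrt

amp_head = 1 / sqrt(2)
amp_tail = 1 / sqrt(2)

def measure():
    # Measurement picks one outcome with probability = amplitude squared,
    # collapsing the superposition to that definite state.
    return "head" if random.random() < abs(amp_head) ** 2 else "tail"

counts = {"head": 0, "tail": 0}
for _ in range(10_000):
    counts[measure()] += 1
print(counts)  # roughly 5000 each
```

Each call to `measure()` stands in for checking the coin: before the call the outcome is undetermined, after the call it is a definite head or tail.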

Entanglement is when two quantum systems combine so that the total system can only be described as a whole, not as two independent parts. Imagine two coins where each must show the opposite of the other; if one is heads, the other has to be tails. These two coins would be entangled - the coin analogy is simplistic, but this does happen with electrons and a property called spin. Two electrons can be in the same 'state' as long as one is spin up and the other is spin down. (I put 'state' in quotes as the total state of each electron actually includes the spin.) If you measure one of the coins - check whether it's heads or tails - then you instantly know what the other coin is. Einstein referred to this as 'spooky action at a distance' (see a description of EPR). Recent experiments have shown it holds for particles separated by hundreds of miles that remain entangled. Entropy (information theory entropy) can also be used as a measure of entanglement (more entropy means more entanglement). To say this another way: for entangled, think correlated in probability - the outcomes are related. If two probabilities are uncorrelated, then a joint outcome is simply the product of the individual probabilities (P_coin1(heads) times P_coin2(tails)). Correlated outcomes aren't that simple and can't be decomposed like that; entanglement makes the system an inseparable whole. If heads is 1 and tails is 0, then the two entangled coins would be in the state (|01> + |10>)/sqrt(2).
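The correlated-versus-uncorrelated distinction can be made concrete with a sampler. Again this only reproduces the measurement statistics classically, as a sketch:

```python
# Sketch: sampling the entangled coins (|01> + |10>)/sqrt(2).
# Only the anti-correlated outcomes '01' and '10' ever occur, each with
# probability 1/2: knowing one coin instantly fixes the other.
import random

def sample_entangled():
    return random.choice(["01", "10"])   # outcomes are perfectly correlated

def sample_independent():
    # Uncorrelated coins: a product of separate probabilities, so '00'
    # and '11' also appear - the entangled state can't be factored this way.
    return random.choice("01") + random.choice("01")

entangled = {sample_entangled() for _ in range(1000)}
independent = {sample_independent() for _ in range(1000)}
print(entangled)    # only {'01', '10'}
print(independent)  # all four of '00', '01', '10', '11'
```

The independent sampler is the product P_coin1 times P_coin2 described above; no such factorization reproduces the entangled statistics.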

If this is a little confusing, remember Feynman's comment: "I think I can safely say that nobody understands quantum mechanics." That probably isn't literally true - Feynman understood it - but it's not easy to understand. The important part is that quantum mechanics allows parts of a system to interact in such a way that 'calculations' by the system can use the entangled probabilities to reach a solution much faster than a classical computer.

Quantum Computing

Normal computers have bits, 0s and 1s, in registers or memory and perform calculations on them with digital logic gates. These 0s and 1s are held in memory electrically and have to be refreshed frequently to keep them at 0 or 1. In a quantum computer, the basic unit is the qubit. As above, a qubit doesn't hold a 0 or a 1, but a superposition of 0 and 1 - it holds the amplitudes mentioned above. With multiple qubits, they can be entangled, and an entangled group of qubits can hold all possible states at once. Two qubits can hold the states 00, 01, 10, 11 simultaneously; a classical computer would have to hold these as 4 separate states, requiring more bits. Ultimately, the number of states a quantum computer can hold at once scales as 2^n where n is the number of qubits: two qubits allow a superposition of 4 states, three qubits 8 states, ..., n qubits 2^n states. A normal computer would need 2^n bits of memory for the same number of states, so the qubits hold the state of the computation while drastically reducing the memory required.

The result of a quantum calculation with n qubits only returns an n-'bit' answer even though the calculation ran over a 2^n-dimensional space. Again, n bits in a classical computer encode n bits of information; a quantum computer spans a 2^n space, but still only returns n bits of information.
This rapid increase in possible states and the ability to cover all of the states is what leads to the potential speed up in quantum computers.
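The 2^n scaling can be seen directly by representing an n-qubit state the way a classical simulator would, as a list of amplitudes. This is a sketch of the bookkeeping, not of any particular machine:

```python
# Sketch: the state of n qubits is a vector of 2**n amplitudes, so the
# memory needed to track it classically doubles with every added qubit.
from math import sqrt

def uniform_state(n):
    # Equal superposition over all 2**n basis states (|00..0>, ..., |11..1>).
    dim = 2 ** n
    return [1 / sqrt(dim)] * dim

for n in (2, 3, 10):
    state = uniform_state(n)
    print(n, "qubits ->", len(state), "amplitudes")
# 2 qubits -> 4, 3 qubits -> 8, 10 qubits -> 1024

# Measuring collapses this to a single basis state: only n bits come out.
```

A 50-qubit state already needs 2^50 (about 10^15) amplitudes to simulate classically, which is why even modest qubit counts strain supercomputers.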

Calculations are done by 'loading' information into the qubits, setting their state much like setting memory in a normal computer, even if the method is very different. Depending on the type of quantum computer, this loading is done with microwaves or lasers. Operations are reversible and act on the quantum mechanical analog of a normal n-bit register; most classical computer gates are not reversible - you can't recover the input from the output. (The NOT gate is an exception, and normal computers could in principle be built entirely from reversible Toffoli gates at the cost of extra memory.) One of the best-known quantum gates is the Hadamard gate, which transforms specific states into a superposition of states and back again. As an analogy, a classical computer running a probability calculation (with random/stochastic/transition matrices) must preserve the property that the probabilities sum to 1. In a quantum computer, operations like the Hadamard gate (unitary matrices) preserve the property that the squares of the amplitudes sum to 1. Positive and negative amplitudes combine to increase or decrease the probabilities of some outcomes while keeping the total probability correct. The entanglement of the qubits means that affecting one affects all - in the two-coin example above, setting one coin to heads meant the other was tails. Another analogy is pulling a red ball out of a bag of red and green balls - a normal computer pulls one ball at a time, but with entanglement, pulling one is like pulling two or more out.
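The Hadamard gate is small enough to write out in full. A minimal sketch, applying the standard 2x2 Hadamard matrix to a single qubit:

```python
# Sketch: the Hadamard gate as a 2x2 unitary matrix. Applying it to |0>
# yields an equal superposition; applying it twice returns the input,
# showing the reversibility discussed above.
from math import sqrt

H = [[1 / sqrt(2),  1 / sqrt(2)],
     [1 / sqrt(2), -1 / sqrt(2)]]

def apply(gate, state):
    # Matrix-vector product: how any single-qubit gate acts on a state.
    return [sum(gate[i][j] * state[j] for j in range(2)) for i in range(2)]

zero = [1.0, 0.0]               # the state |0>
superpos = apply(H, zero)       # (|0> + |1>)/sqrt(2)
print(superpos)                 # [0.707..., 0.707...]

# Unitarity preserves the sum of squared amplitudes (total probability 1).
assert abs(sum(a * a for a in superpos) - 1) < 1e-9

back = apply(H, superpos)       # H is its own inverse: H(H|0>) = |0>
print(back)                     # back to [1.0, 0.0]
```

Note the minus sign in the matrix: that negative amplitude is what lets a second Hadamard cancel the |1> component and undo the first, something no probability-only matrix can do.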

Again, the quantum computer qubits hold all values until measured; a classical computer would hold one value in the same number of bits. Measurements are then done (again with microwaves or lasers) to 'collapse' the 2^n states into one that is the result with n bits of info (the original number of qubits). Measurement is not reversible.

Issues with Quantum Computers
Due to the nature of quantum computing, the outcomes are probabilistic - a run only provides a valid solution with some probability. This is true in normal computers too, but decades of work have reduced the error to near 0, so we trust the outcome; almost no one thinks about error correction when using a normal computer these days. Error correction in quantum computing is in its infancy. The computers have to be kept extremely cold and isolated as much as possible, since any noise can affect the calculation, and calculations realistically need to be run several times to be certain of the result. The errors aren't just value errors as in a classical computer, but errors in the way the qubits interact. While a quantum computer is built to have its qubits interact with each other as much as possible (entanglement), it's also built to keep them from interacting with the environment as much as possible! Noise can cause decoherence, which means the qubits stop interacting as one group. Environmental factors like heat, vibration, and electromagnetic or other radiation all work against the system. The redundancy required for error correction has been estimated at hundreds, thousands or even millions of times; fixing this is a key area of research. A recent advance reduced the estimated number of qubits needed to crack 2048-bit PKI keys, including error correction, from billions to millions. Qubit lifetimes before decoherence are also short and need to scale up - IBM's 50-qubit system in 2017 achieved about 90 microseconds.
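The cost of redundancy can be illustrated with the simplest classical scheme, a repetition code with majority voting. Quantum error correction is far more involved (qubit states can't simply be copied), so treat this purely as an analogy for why many noisy physical units are spent per reliable logical unit:

```python
# Sketch (classical analogy only): a repetition code. Store several noisy
# copies of a bit and take the majority to recover it, trading redundancy
# for a much lower error rate - the same trade quantum codes must make.
import random

def noisy_copy(bit, p_flip):
    # Each stored copy flips with probability p_flip.
    return bit ^ 1 if random.random() < p_flip else bit

def majority_vote(bit, copies, p_flip):
    votes = [noisy_copy(bit, p_flip) for _ in range(copies)]
    return 1 if sum(votes) > copies / 2 else 0

random.seed(0)
p = 0.1
raw_errors = sum(noisy_copy(0, p) for _ in range(10_000))
corrected_errors = sum(majority_vote(0, 5, p) for _ in range(10_000))
print(raw_errors, corrected_errors)  # corrected error count is far lower
```

With a 10% flip rate, five copies drop the logical error rate to under 1%; pushing errors low enough for long computations is what drives the estimates of thousands of physical qubits per logical qubit.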

Scaling quantum computers is another challenge, and not just because of environmental factors. They need to scale in coherence time and number of qubits, as well as in programming techniques. A small number of qubits easily fit close to each other, but when the number grows very large, the distances between them become an issue. In terms of programming, the same trade-off facing classical computers exists - simple hardware with complex languages or complex hardware with simple languages (or the worst of both). The difficulty with quantum computers is the complexity of understanding the operations: a computer with n qubits might require on the order of n^3 logic 'gates' or operations to achieve a result.

Current Work

There are two main types of quantum computers now: ion/atom traps and superconducting circuits. Ion traps use lasers to write, read and trap ions suspended in electromagnetic fields. The ions in the computer interact in a way you could think of as vibrating together. Similarly, atoms can be trapped in an optical lattice.

The other main type is superconductor based. These are often modeled on superconducting quantum interference devices (SQUIDs), small rings of superconducting current; oscillating microwaves set and read the values. The devices are connected together via radio frequency channels, which can be part of a printed circuit board. A company called D-Wave has been working on quantum computers for years. They structure the computer around the problem to be solved, so it requires almost no code, but their computers don't connect all qubits together and can't solve all types of problems. Their machines resemble annealing systems, using transverse tunneling fields (analogous to temperature in real annealing). Google and IBM are working on more generic computers that link all qubits. In 2017, IBM produced systems with up to 50 qubits; in 2018, Google raised that to 72 qubits. Compared to classical computers doing simulated annealing or quantum Monte Carlo calculations, Google reported quantum annealing to be 100 million times faster. IBM offers a test quantum computing environment if you want to sign up.

Quantum supremacy is when a quantum computer is able to solve a classical computer science problem faster than a supercomputer (or any classical computer). It's uncertain how large a quantum computer will have to be to achieve this; some expect it soon, while others doubt it will happen at all. To be clear, quantum computers solve the same types of problems that classical computers can - both are Turing machines - but they can solve some of them much faster. Problems a classical computer can't solve at all, like the halting problem, will be unsolvable by a quantum computer too, and NP-complete problems are not expected to become easy. In complexity terms, quantum computers solve the class BQP (bounded-error quantum polynomial time), which is thought to be larger than BPP (bounded-error probabilistic polynomial time). In particular, quantum computers should be good at factoring large numbers, calculating optimized paths (like courier routes), quantum search (Grover's algorithm), and physics and chemistry problems involving the quantum nature of the universe - nature isn't 'classical', it's quantum mechanical.
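Grover's search, mentioned above, is small enough to simulate classically for a toy case. This sketch tracks the amplitudes directly (which a real quantum computer of course doesn't need to do) just to show the amplification at work:

```python
# Sketch: a toy classical simulation of Grover's search over 8 states
# (3 qubits). After about sqrt(N) iterations, the marked item's
# probability is amplified from 1/8 to roughly 0.945.
from math import sqrt

N = 8                      # 2**3 basis states
marked = 5                 # index of the item being searched for
state = [1 / sqrt(N)] * N  # start in the uniform superposition

for _ in range(2):         # about (pi/4) * sqrt(N) iterations
    state[marked] = -state[marked]      # oracle: flip the marked sign
    mean = sum(state) / N               # diffusion: invert about the mean
    state = [2 * mean - a for a in state]

probs = [a * a for a in state]
print(round(probs[marked], 3))  # ~0.945, vs 0.125 for a blind guess
```

The sign flip plus inversion-about-the-mean is exactly the positive/negative amplitude interference described earlier, steering probability toward the answer while the total stays 1.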

With the ability to factor large numbers quickly, quantum computers could crack traditional cryptography easily. In 1994, Shor developed an algorithm that can quickly factor large numbers and compute discrete logarithms. Based on modular exponentiation and quantum Fourier transforms, it can recover the private keys of public key cryptography (PKI, for public key infrastructure) and render PKI insecure. In 2016, NIST requested submissions of quantum-resistant cryptography algorithms; currently, 26 algorithms are in the later stages of evaluation. The key weakness of PKI is the asymmetric key portion, which relies on factoring large numbers or computing discrete logarithms being hard for classical computers. Symmetric keys are much more resistant to quantum computing, though Grover's algorithm does make them somewhat susceptible - thankfully, doubling the symmetric key size is often enough in a post-quantum world.
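The classical skeleton of Shor's algorithm can be shown on a toy number. The quantum speed-up lies entirely in the period-finding step (done with the quantum Fourier transform); here it is brute-forced, which only works because N = 15 is tiny:

```python
# Sketch: the classical outline of Shor's algorithm, factoring N = 15.
# A quantum computer finds the period r exponentially faster; here we
# brute-force it to show how the period yields the factors.
from math import gcd

N = 15
a = 7                       # a base coprime to N (gcd check below)
assert gcd(a, N) == 1

# Find the period r: the smallest r > 0 with a**r mod N == 1.
r = 1
while pow(a, r, N) != 1:
    r += 1
print("period:", r)         # 4, since 7^4 = 2401 = 160*15 + 1

# If r is even and a**(r/2) is not -1 mod N, gcds give nontrivial factors.
assert r % 2 == 0
x = pow(a, r // 2, N)
factors = sorted((gcd(x - 1, N), gcd(x + 1, N)))
print("factors:", factors)  # [3, 5]
```

For a 2048-bit RSA modulus the period-finding loop is hopeless classically, which is exactly the gap the quantum Fourier transform closes and why PKI is at risk.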

Some approaches to post-quantum cryptography include one-time keys (hashes or OTP), multivariate equations, lattices and error correcting codes. All rely on the difficulty of decoding the information without all the inputs, or on the time it takes to decode. Current PKI also relies on the hardness of its underlying problems - a classical computer could crack a PKI key, but the algorithms are designed to make that take thousands of years.

The drive to improve quantum computers is growing, as being the first with a viable quantum computer is important financially and strategically. Microsoft has entered the race with Google, IBM, D-Wave and researchers around the world. Microsoft and Google offer courses on programming quantum computers, IBM has a system online for learning more, Google has its own playground and scripting language, Microsoft has announced the Q# language, and researchers at labs everywhere are driving this forward.

