Quantum Computing

Quantum computing is not a panacea yet

Quantum computing promises a revolution in scientific discovery. Its main advantage over digital computing is in quickly solving highly complex problems that would otherwise require significant time or resources. Currently, solutions to many complex problems can only be estimated using supercomputers or pools of servers running for days or weeks. Other problems are so complex that they cannot be solved on human timescales using today’s technology. Although research is progressing, there is no guarantee that a practically useful quantum computer can ever be realized.

The impact of a fault-tolerant quantum computer on science and engineering would be vast. For example, it could help reduce global carbon emissions by finding a better process for producing ammonia than the energy-intensive Haber-Bosch process used today.

For companies involved in quantum computing, such as IBM, Microsoft, Honeywell, Amazon, D-Wave, Intel and IQM, it presents a significant new revenue stream. But it also threatens to displace existing applications that use supercomputers or server pools to estimate solutions to tough problems.

Data centers, physical servers and applications used for complex processing tasks could be most vulnerable to displacement by quantum computing, particularly modeling applications in fields such as finance, biochemistry and engineering. For many applications, however, the cost and complexity of quantum computing will not guarantee a return on investment. Most quantum computer prototypes today require operating temperatures near absolute zero, large upfront investments (although many are now accessible remotely as cloud services) and highly specialized skills, making adoption a significant and costly undertaking.

So, what is quantum computing? In a digital computer, chips use transistors that are either switched on or off, representing 1s and 0s. Digital circuits use this mechanism to perform logic and memory functions, ultimately enabling complex IT systems and software. Electrical signals encoding this binary information carry data through the system.
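
To make the contrast concrete, here is a minimal Python sketch (our illustration, not from the original text) of how classical bits take only definite 0/1 values and combine through simple logic operations:

```python
# Two classical bits: each is definitely 0 or 1, mirroring a transistor switched off or on.
a, b = 1, 0

# Basic logic functions built from those definite states.
print(a & b)  # AND -> 0
print(a | b)  # OR  -> 1
print(a ^ b)  # XOR -> 1

# Binary encoding: the integer 5 is stored as the bit pattern 101.
print(format(5, "03b"))  # -> "101"
```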

Digital computers are deterministic systems: they proceed through calculations step by step and can only mimic the randomness of mathematics and the chaotic nature of the real world. They have another limitation: their finite numerical precision means they can only approximate nature when modeling it. Even with high clock speeds and the massively parallel processing resources of supercomputers, there are many complex problems classical computers cannot practically attack; there are too many steps to work through, or too much simplification is required to be practical. Financial modeling, route optimization and protein folding in molecular biology are just a few areas where the limits of classical computing hinder progress.
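
Both limitations are easy to demonstrate on any classical machine. The short Python sketch below (our example, with arbitrary values) shows that floating-point arithmetic only approximates real numbers and that "randomness" from a deterministic computer is reproducible rather than genuine:

```python
import random

# Finite numerical precision: classical floats can only approximate continuous values.
x = 0.1 + 0.2
print(x)         # 0.30000000000000004, not exactly 0.3
print(x == 0.3)  # False

# Determinism: seeding the pseudo-random generator makes the "random" sequence repeatable.
random.seed(42)
first = [random.random() for _ in range(3)]
random.seed(42)
second = [random.random() for _ in range(3)]
print(first == second)  # True -- the randomness is only mimicked
```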

The quantum in quantum computing refers to tiny particles at the very limit of what can be measured. At the quantum scale, particles behave spontaneously and randomly, and they only assume a known state (i.e., their movement and position) once they are measured. A quantum particle “decides” what it is doing only when observed, so the results of observations are probabilistic. Unlike the clear, definite states of traditional computers, quantum bits, called qubits, take on multiple probabilistic states (or superpositions) between 0 and 1. The unobserved particle has essentially yet to decide its definite state.
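
One common way to picture this is to write a qubit’s state as two complex amplitudes whose squared magnitudes give the probabilities of measuring 0 or 1. The NumPy sketch below (a classical simulation for illustration only, not real hardware) puts a qubit into an equal superposition and then simulates a measurement:

```python
import numpy as np

# A qubit state is a 2-component complex vector: amplitudes for |0> and |1>.
zero = np.array([1.0, 0.0], dtype=complex)

# The Hadamard gate places |0> into an equal superposition of |0> and |1>.
H = np.array([[1,  1],
              [1, -1]], dtype=complex) / np.sqrt(2)
psi = H @ zero

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(psi) ** 2
print(probs)  # [0.5 0.5] -- each outcome is equally likely

# Simulating a measurement: the state "collapses" to a definite 0 or 1.
print(np.random.choice([0, 1], p=probs))
```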

Rather than checking every possible combination one by one, quantum computers use the inherent uncertainty in quantum particles and apply physics to solve the problem. Imagine we are trying to find the correct combination of 0s and 1s that lies between 000 and 111. We could either go through every possible combination in series using a digital computer, or we could create three qubits and put them into a superposition state where we don’t know what they are doing. We can then perform operations on these qubits using lasers or electromagnetic fields.

Crucially, we manipulate these particles without looking at them, so they remain in superposition. Once we have performed our operations, we measure the qubits’ states. The uncertainty between 0 and 1 then “collapses” into the correct combination.
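
To make the three-qubit example slightly more concrete, the sketch below classically simulates the amplitude-amplification idea behind Grover’s search (our choice of illustration; the original text does not name a specific algorithm). The hidden combination 101 and the use of NumPy are assumptions for the demo:

```python
import numpy as np

n = 3                # three qubits -> 8 possible combinations, 000 to 111
N = 2 ** n
target = 0b101       # the hypothetical hidden combination we want to find

# Put all qubits in superposition: equal amplitude on every combination.
state = np.full(N, 1 / np.sqrt(N))

# "Oracle" operation: flips the sign of the hidden combination's amplitude
# (on hardware this is done with gates, without ever looking at the qubits).
oracle = np.eye(N)
oracle[target, target] = -1

# Diffusion operation: reflects every amplitude about the average amplitude.
diffusion = 2 * np.full((N, N), 1 / N) - np.eye(N)

# For 8 possibilities, about two iterations are enough.
for _ in range(2):
    state = diffusion @ (oracle @ state)

# Only now do we "measure": the correct combination dominates the probabilities.
probs = np.abs(state) ** 2
print({format(i, "03b"): round(float(p), 3) for i, p in enumerate(probs)})
print("most likely:", format(int(np.argmax(probs)), "03b"))  # -> 101
```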

This highly simplified example may raise more questions than it answers. The concepts in play at the quantum level are not intuitive, and the math is not simple. The critical point is that a quantum computer allows operations to be performed across a vast range of potential answers in as little as one step, whereas a digital computer must investigate each possible solution separately.

In the real world, however, building a quantum computer large enough to be practical has so far proved elusive. The problem is that controlling and protecting quantum states at such small scales is extremely difficult. Any interference, such as a stray particle bumping into a qubit, can change its state by accident and lead to a loss of coherence. Errors that prevent coherence from lasting long enough to perform useful, correct calculations are the biggest obstacle to a viable quantum computer, and the more qubits there are, the harder these errors are to control. Fault-tolerant quantum computers with hundreds, or even thousands, of useful qubits are the holy grail of quantum computing research.
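
To illustrate the decoherence problem in the simplest possible terms, the sketch below (a classical toy model with an arbitrarily chosen disturbance) applies a small unintended rotation to a qubit that should be in an exact 50/50 superposition and shows how the measurement probabilities drift:

```python
import numpy as np

# A qubit in an ideal equal superposition: 0 and 1 should each appear with probability 0.5.
psi = np.array([1, 1], dtype=complex) / np.sqrt(2)

# A stray interaction acts like a small, unwanted rotation of the state
# (the 0.2-radian angle is an arbitrary illustration).
theta = 0.2
noise = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
psi_noisy = noise @ psi

print(np.abs(psi) ** 2)        # [0.5 0.5]           -- the intended probabilities
print(np.abs(psi_noisy) ** 2)  # roughly [0.31 0.69] -- the encoded information is corrupted
```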

Even though there is no guarantee that large-scale quantum computing is viable, scientists, government sponsors and investors either want to believe it is possible or don’t want to risk missing out on its possibilities. Perhaps a more telling indicator is that some governments are reportedly collecting encrypted data today in the hope of cracking it with a quantum computer later. As such, quantum-proof encryption is increasingly being considered for high-security use cases. But even quantum pioneers say we’re at least a decade away from a practical quantum computer.

Even if quantum computing is realized technically, it will probably not be as disruptive as some have forecast. Practicalities, such as accessing a quantum computer and programming it, are far from simple, and the cost will be prohibitive for many applications. As a result, quantum computers will not displace digital computers, including supercomputers. Instead, they will augment existing computing infrastructure as a new accelerated computing platform.
