A Turning Point For Quantum Computing

Quantum computing is moving from theory and experimentation into engineering and applications.

But now that quantum computing is going mainstream, it is incumbent on businesses and governments to understand its potential, on universities to beef up their teaching programs in quantum computing and related subjects, and on students to become aware of promising new career paths.

Quantum computing got its start in 1981 at a small, now-famous conference on the physics of computing, jointly hosted by IBM and MIT.

There, the Nobel Prize-winning physicist Richard Feynman challenged computer scientists to invent a new kind of computer based on quantum principles, so as to better simulate and predict the behavior of real matter. Matter, Feynman reasoned, is made of particles such as electrons and protons that obey the same quantum laws that would govern this new computer’s operation.

Ever since, scientists have grappled with Feynman’s dual challenge: understanding the capabilities of a quantum computer and figuring out how to build one. If any consensus emerged from a recent conference in Yorktown Heights, it was that quantum computers will be very different from today’s computers, not only in what they look like and are made of, but, more importantly, in what they can do.

Quantum computing works in a fundamentally different manner from today's computing. A traditional computer makes use of bits, where each bit represents either a one or a zero. In contrast, a quantum bit, or qubit, can exist in a superposition, representing a one, a zero, or both at once.
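
To make that contrast concrete, here is a minimal sketch in plain Python with NumPy (the article names no toolkit, so this is an illustration rather than any vendor's API) of a qubit as a pair of complex amplitudes:

```python
import numpy as np

# A classical bit is either 0 or 1. A qubit's state is a pair of
# complex amplitudes (a, b) with |a|^2 + |b|^2 = 1; measuring it
# yields 0 with probability |a|^2 and 1 with probability |b|^2.
zero = np.array([1, 0], dtype=complex)  # the definite-0 state
one = np.array([0, 1], dtype=complex)   # the definite-1 state

# An equal superposition: "both at once" until it is measured.
plus = (zero + one) / np.sqrt(2)

print(np.abs(plus) ** 2)  # [0.5 0.5]: 0 or 1 with equal probability
```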

Therefore, two qubits can be in the states 00, 01, 10 and 11 all at the same time. Each added qubit doubles the total number of potential states, so a machine with n qubits can hold a superposition of 2^n states. The use of qubits could enable us to perform calculations vastly faster than is possible with traditional computers; indeed, a conventional computer of any scale or speed simply won't be able to emulate what a quantum computer of 50 to 100 qubits could do.
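
A quick back-of-the-envelope sketch of that doubling, again in plain Python with NumPy (the variable names are illustrative, not from any quantum SDK):

```python
import numpy as np

# The joint state of n qubits is a vector of 2**n complex amplitudes.
# Combining systems is a tensor (Kronecker) product, so every added
# qubit doubles the number of basis states being tracked at once.
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)

state = plus
for n in range(1, 6):
    print(f"{n} qubit(s): {state.size} amplitudes")
    state = np.kron(state, plus)

# A 50-qubit state would need 2**50 (about 10**15) amplitudes, which
# is why conventional machines cannot emulate such a device directly.
```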

In recent years, scientific advances have been coming with increasing frequency, just as the search for new kinds of computing has grown more urgent because chip makers are beginning to reach the limits of Moore's Law.

More than 8,000 articles on the topic were published in academic journals last year alone, and many of them came from engineering professors rather than information theorists or physicists. At the same time, opinion in the scientific community has been coalescing around a handful of approaches that are considered most promising.

For decades, it seemed that the dream of building a full-fledged quantum computer was always 20 years off — over the horizon of predictability. Now, leaders in the field talk about a breakthrough potentially within the next 10 years.

Most of today’s research in academia and industry is focused on building a universal quantum computer that can be programmed to perform any computing task.

The major challenges include creating high-quality qubits and packaging them together in a scalable way, so that they can perform complex calculations in a controllable manner while limiting the errors that can result from heat and electromagnetic radiation.

Some tech companies and researchers are instead pursuing an approach called quantum annealing, which is aimed at producing something less than a universal quantum computer. These machines are limited to a narrow class of optimization problems. Also, while simulations of universal quantum computers show strong evidence of quantum speed-up, it is so far unclear whether quantum annealing produces better results than could be achieved using conventional computers.