Why quantum ‘utility’ should replace quantum advantage

As the quantum computing industry pushes forward, the goalposts keep moving with it.

A long-sought objective was to attain quantum “supremacy”: demonstrating that a quantum computer could perform a calculation that no traditional computer on Earth could, with no requirement of practical benefit.

Google claimed to reach that goal with its landmark scientific paper in 2019, though IBM notably expressed skepticism. In any case, it was a computer science exercise with no practical real-world relevance.

Since the Google announcement, the industry has intensified its efforts to attain quantum “advantage,” defined as achieving a business or scientific advantage by exceeding the computing capacity of the largest supercomputers in a relevant application.

As a reference point for comparison and benchmarking, quantum advantage is certainly more useful than quantum supremacy. It is often connected to achieving major breakthroughs in drug discovery, financial trading or battery development.

However, quantum advantage ignores one important point: Should we really wait around for million-qubit quantum steampunk golden chandeliers to outperform supercomputers before we consider quantum computers meaningful? Or should we focus on measuring performance improvements in comparison to the units of hardware we use in today’s classical computers, e.g., individual CPUs (central processing units), GPUs (graphics processing units) and FPGAs (field programmable gate arrays)?

What may be a more valuable goal for this still-nascent industry is achieving quantum “utility,” or usefulness, as soon as possible. Quantum utility is defined as a quantum system outperforming classical processors of comparable size, weight and power in similar environments.
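To make that definition concrete, here is a minimal sketch in Python of how such a comparison might be framed. The benchmark figures, the factor-of-two “comparable” tolerance and all names are hypothetical; the point is only that the yardstick is a classical processor of similar size, weight and power (SWaP), not a supercomputer.

```python
from dataclasses import dataclass

@dataclass
class ProcessorBenchmark:
    """One processor's result on one workload (all figures hypothetical)."""
    name: str
    volume_litres: float    # size
    weight_kg: float        # weight
    power_watts: float      # power
    runtime_seconds: float  # time to solve the target workload

def comparable_swap(a: ProcessorBenchmark, b: ProcessorBenchmark,
                    tolerance: float = 2.0) -> bool:
    """True if two processors are within a factor of `tolerance` on
    size, weight and power -- a fair 'same class' comparison."""
    return all(
        max(x, y) / min(x, y) <= tolerance
        for x, y in [(a.volume_litres, b.volume_litres),
                     (a.weight_kg, b.weight_kg),
                     (a.power_watts, b.power_watts)]
    )

def demonstrates_utility(quantum: ProcessorBenchmark,
                         classical: ProcessorBenchmark) -> bool:
    """Quantum utility: beat a comparable classical processor on the
    same workload, not the world's largest supercomputer."""
    return (comparable_swap(quantum, classical)
            and quantum.runtime_seconds < classical.runtime_seconds)

# Entirely illustrative numbers:
gpu = ProcessorBenchmark("GPU workstation", 40, 15, 700, runtime_seconds=120)
qpu = ProcessorBenchmark("Rack-mounted QPU", 60, 25, 900, runtime_seconds=45)
print(demonstrates_utility(qpu, gpu))  # True under these made-up figures
```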

Accelerating commercialization

Those who have examined quantum computing in any depth know the massive impact it will have on IT, business, the economy and society. A future of quantum supercomputing mainframes with exponential speedup, error-corrected qubits and quantum internet will be a very different world than the one we live in today.

That said, similar to the classical mainframes of the 1960s, quantum mainframes will likely remain large and fragile machines for the foreseeable future, requiring ultra-low temperatures and complex control systems to operate. Even when fully operational, there will only be a few quantum mainframes located at supercomputing and cloud-computing facilities around the world.

The quantum computing industry would be better off emulating the success of classical computers. When personal computers arrived in the late 1970s and early ’80s, IBM and others were able to market new models every year that offered incremental improvements over the previous models. This market dynamic is what helped sustain Moore’s law.

Quantum computing needs a similar market dynamic to scale and thrive. Investors can’t be expected to keep handing out money while waiting for quantum computers to outperform the world’s largest supercomputers. An annual release of new, improved and ever more “useful” quantum computers will provide the revenue assurance that drives the long-term investment required to achieve the technology’s full potential.

With a steady supply of useful quantum systems for a variety of applications, there is no reason to wait in line to process a calculation on one of the few massive quantum mainframes available in the cloud when you can have a quantum processor right next to you, integrated with your existing classical systems. Your application may need a result instantly, faster than “quantum in the cloud” can deliver it, or it may have to rely on on-premises or on-board compute where cloud access isn’t possible.

Extending the idea of quantum utility, you can imagine the following scenarios:

  • Signal and image processing in autonomous and intelligent technologies at the network edge in robots, driverless vehicles and satellites.
  • Industry 4.0 end-point applications such as digital twins in manufacturing facilities.
  • Distributed network applications such as battlefield situational awareness in defense.
  • Classical computing accessories, providing a boost as needed for laptops and other common devices.

Room-temperature quantum computing in small form factors will be required to achieve these quantum “accelerator” applications in the next few years. Several approaches are being pursued, but the most promising is using so-called nitrogen-vacancy centers in diamond to make qubits.

Enabling technologies

Room-temperature diamond quantum computing works by leveraging an array of processor nodes, each composed of a nitrogen-vacancy (NV) center (a defect in the ultra-pure diamond lattice) together with a cluster of nuclear spins. The nuclear spins act as the qubits of the computer, while the NV centers act as quantum buses that mediate the operations between the qubits and their input/output.
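To make that node-and-bus layout concrete, here is a minimal structural sketch in Python. It models only the organization described above (nuclear-spin registers addressed through an NV-center bus); the node count, register size and method names are invented for illustration and reflect no real device or control stack.

```python
from dataclasses import dataclass, field

@dataclass
class NuclearSpinQubit:
    """A nuclear spin near the NV center; stores the quantum information."""
    label: str

@dataclass
class ProcessorNode:
    """One node of a hypothetical diamond processor: an NV-center 'bus'
    that mediates operations on its cluster of nuclear-spin qubits."""
    nv_id: int
    register: list[NuclearSpinQubit] = field(default_factory=list)

    def two_qubit_gate(self, a: str, b: str) -> str:
        # In the physical device the NV center mediates the interaction;
        # here we only record the intent, to show the bus-and-register shape.
        return f"NV{self.nv_id}: gate({a}, {b}) mediated by NV bus"

# A toy array of four nodes, each with a five-spin register (sizes invented):
nodes = [
    ProcessorNode(nv_id=i,
                  register=[NuclearSpinQubit(f"n{i}.{j}") for j in range(5)])
    for i in range(4)
]
print(nodes[0].two_qubit_gate("n0.0", "n0.1"))
```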

The primary reason diamond quantum computers can work at room temperature is that the ultra-hard diamond lattice serves as a kind of quantum mechanical dead space where qubits can survive for a few hundred microseconds.
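As a rough back-of-the-envelope illustration of what that coherence window buys you: dividing the coherence time by a gate duration bounds the number of sequential operations available before the qubit decoheres. The gate time below is a hypothetical placeholder, not a published figure.

```python
coherence_time_us = 300.0  # "a few hundred microseconds", per the text above
gate_time_us = 1.0         # hypothetical gate duration, for illustration only

# Crude upper bound on sequential gates within one coherence window:
ops_budget = coherence_time_us / gate_time_us
print(f"~{ops_budget:.0f} sequential operations before decoherence")
```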

Quantum scientists from the University of Stuttgart in Germany pioneered many diamond quantum computing advances in algorithms, simulations, error correction and high-fidelity operations. However, they hit a roadblock when they tried to scale systems beyond a handful of qubits because of challenges with qubit fabrication yield and precision.

Since then, Australian quantum scientists have found a way to address scaling issues, as well as the miniaturization and integration of the electrical, optical and magnetic control systems of diamond quantum computers. Their work will enable the scaling up of qubit numbers while simultaneously scaling down the size, weight and power of diamond quantum systems.

The scientists further demonstrated that compact and robust quantum accelerators were possible for mobile applications in robotics, autonomous systems and satellites, as well as massively parallelized applications for simulating molecular dynamics in drug design, chemical synthesis, energy storage and nanotechnology.

Because of diamond-based computing’s unique benefits, there is currently a global research effort underway involving leading academic institutions such as the University of Cambridge and Harvard University. The Australian National University’s diamond-based quantum computing research has moved into an initial commercialization phase.

Other types of quantum computing technology operating at room temperature in relatively small form factors are advancing as well, including trapped-ion and cold-atom quantum computers. However, these require vacuum systems, precision laser systems or both. One quantum computing startup has successfully developed a trapped-ion system that fits in two server racks, but it’s uncertain whether such systems can be miniaturized much further.

Recalibrating assumptions

For the industry to achieve the vision of quantum accelerators that deliver quantum utility, the technology must be compatible with scalable semiconductor manufacturing processes, with qubits formed and integrated alongside control systems that are robust, low-maintenance and long-lived. As classical computers showed us, the best way to do this is to miniaturize and develop integrated quantum chips.

Much as with the first transistors at the dawn of ubiquitous classical computing, the key technical challenge to achieving widespread quantum utility will be the fabrication of integrated quantum chips. As with classical computing, however, once fabrication is solved, the devices can be simple to use and deploy.

While early useful quantum systems will have considerably fewer qubits than quantum mainframes, they may become a focal point of the industry and prospective markets once the first integrated chips are manufactured.

The downstream implications are almost unimaginable — there may not be an area where a room-temperature quantum system won’t make a fundamental change in how we solve problems. There is a clear message to all the product designers, software developers, market forecasters and social observers here as well: The time to get your head around quantum computing is now.

In the near term, useful quantum computers will massively disrupt supply chains, even entire value chains. Preparing for that impact implies understanding not only the technology, but also its economic impact. And of course, there are incredible investment opportunities in a technology moving this fast as well.

Quantum utility also means the future of quantum can be heterogeneous — accelerators can exist alongside mainframes and be deployed for different reasons and applications. It will encourage a groundswell of cooperation as opposed to direct competition — and accelerate innovation and adoption in the quantum industry.