Why now is the time to get ready for quantum computing

For the longest time, even while scientists were working to make it a reality, quantum computing seemed like science fiction. It’s hard enough to make any sense out of quantum physics to begin with, let alone the practical applications of this less than intuitive theory. But we’ve now arrived at a point where companies like D-Wave, Rigetti, IBM and others actually produce real quantum computers.

These machines are still in their infancy and nowhere near powerful enough to run anything but very basic programs, simply because they can’t run for long before their quantum states decohere. But virtually all experts say that these are solvable problems and that now is the time to prepare for the advent of quantum computing. Indeed, Gartner just launched a Quantum Volume metric, based on IBM’s research, that aims to help CIOs prepare for the impact of quantum computing.

To discuss the state of the industry and why now is the time to get ready, I sat down with IBM’s Jay Gambetta, who will also join us for a panel on Quantum Computing at our TC Sessions: Enterprise event in San Francisco on September 5, together with Microsoft’s Krysta Svore and Intel’s Jim Clarke.

Gambetta, of course, agrees that now is the time to start thinking about how to get ready. But he also noted that there are still plenty of misunderstandings around quantum computing. “Not everything is going to be sped up by a quantum computer,” he said.

“That’s kind of the myth and that goes back to people thinking that classical computers can do everything. Computers are so good that people have forgotten that there are problems that are really hard for classical computers. And we know they are hard, so we find ways around them, or we don’t even attempt to solve them. So what quantum does, it gives you a different lens that allows you to look at problems that you would never be able to look at with a classical computer.”

The principles of quantum computing are very different from those of classical computing, though, so developers need to develop a new kind of intuition for them. “We don’t see superposition — or experience superpositions — in our everyday life or experience entanglement. So how do you make an analogy for something when you live in a classical world? That is a different set of equations that govern how things behave,” Gambetta said.
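
Both ideas can at least be peeked at numerically. The sketch below is a minimal, purely classical state-vector simulation (not any vendor's API) that builds the textbook Bell state: a Hadamard gate creates a superposition, a CNOT gate entangles two qubits, and the resulting measurement probabilities show the perfectly correlated outcomes that have no classical analogue.

```python
import numpy as np

# Single-qubit basis states |0> and |1> as vectors of amplitudes.
zero = np.array([1, 0], dtype=complex)

# Hadamard gate: puts one qubit into an equal superposition.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

# CNOT gate: flips the second qubit when the first is |1>.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Start in |00>, apply H to the first qubit, then entangle with CNOT.
state = np.kron(zero, zero)
state = np.kron(H, np.eye(2)) @ state
state = CNOT @ state

# Measurement probabilities: |amplitude|^2 per basis state.
probs = np.abs(state) ** 2
print(dict(zip(["00", "01", "10", "11"], probs.round(3))))
# Only "00" and "11" appear, each with probability 0.5: the two
# qubits are individually random but perfectly correlated.
```

The point of the exercise is the last line: no assignment of independent values to each qubit reproduces that distribution, which is exactly the kind of intuition Gambetta says has to be built up by experimenting.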

In the current state of the industry, the mission for a lot of companies, including IBM, is to get early machines — or even just simulators — out there so that developers can start developing this intuition. IBM and Rigetti make their quantum computers available in the cloud, for example. Microsoft, which doesn’t have a working quantum computer yet, offers a quantum programming language and a simulator.

Gambetta believes that it will take people five to ten years (or maybe more) of using quantum computers to develop this intuition. For the time being, we can still use classical computers to simulate the current set of physical quantum computers.

“That’s why we want to start early and get the quantum computers into [developers’] hands, even while we can still simulate the current devices on our classical computers because it won’t be long before they will be beyond what is possible,” he noted.

Today, we can simulate an ideal machine of up to fifty qubits. But there’s a wrinkle there, because those ideal machines don’t have any noise, something that in many ways defines today’s quantum computers.

When you start simulating noise, a lot of the resources go to that alone. In real machines, though, noise is one of the things developers will have to contend with, and they will need to develop algorithms that are resistant to it. Yet today’s classical computers can only simulate the noise of machines with maybe ten qubits.
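
A back-of-the-envelope calculation shows where those limits come from. Storing the full state of an ideal n-qubit machine takes 2^n complex amplitudes; simulating noise is usually done with a density matrix, which squares that count. The figures below are rough estimates assuming 16 bytes per complex number, not a claim about any particular simulator.

```python
def ideal_sim_bytes(n_qubits: int) -> int:
    """Memory for an ideal (noise-free) state vector:
    2**n complex amplitudes at 16 bytes each (complex128)."""
    return (2 ** n_qubits) * 16

def noisy_sim_bytes(n_qubits: int) -> int:
    """Memory for a density-matrix (noisy) simulation:
    (2**n) x (2**n) complex entries -- the cost squares."""
    return ((2 ** n_qubits) ** 2) * 16

for n in (10, 30, 50):
    print(f"{n} qubits, ideal: {ideal_sim_bytes(n):,} bytes; "
          f"noisy: {noisy_sim_bytes(n):,} bytes")
```

At 30 qubits an ideal simulation already needs on the order of 16 GiB; at 50 qubits it is petabytes, and a 50-qubit density matrix is far beyond any machine. That exponential wall is why the fifty-qubit ideal and roughly ten-qubit noisy limits sit where they do.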

Obviously, there’s still lots of work to be done before we arrive at a general-purpose quantum computer, but that’s also what makes the current state of the technology so exciting. Gambetta believes this gives the community a chance to influence the next generation of computing now and — maybe most importantly — to have the tools in place to make use of these machines once they arrive.

“I think we have this unique opportunity to bring in the next wave of computation. If we follow the path that we did last time, where we develop the technology, people write compilers, and people use compilers and then eventually people did applications — that takes a long time,” he argued. “If we can shorten that down by developing the compiler in the open, working with clients on applications and giving systems to people to explore and see what they can do, I think that allows us to reduce that timeline.”

On the other hand, though, enterprises are now looking into actual practical applications of quantum computing. Chemistry problems, especially, are what Gambetta thinks companies should focus on right now, because quantum computers are ideally suited to solving them.

In addition, he’s bullish on quantum machine learning: not to speed up the standard machine learning we do today, but to offer different types of feature-space mappings, equivalent to quantum circuits, that would be very hard for classical computers to replicate.
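
To make “feature-space mapping” concrete, here is a deliberately simplified, classically simulable sketch of the idea: each classical feature becomes a rotation angle on a qubit, and two data points are compared by the overlap of their encoded states (a quantum kernel). The angle encoding below is an illustrative toy, not the entangling circuits Gambetta is referring to, which are precisely the ones classical computers struggle to replicate.

```python
import numpy as np

def encode(features):
    """Map a classical feature vector to a multi-qubit state,
    one qubit per feature, via Ry-style rotations (a toy encoding)."""
    state = np.array([1.0])
    for theta in features:
        qubit = np.array([np.cos(theta / 2), np.sin(theta / 2)])
        state = np.kron(state, qubit)  # product state, no entanglement
    return state

def kernel(x, y):
    """Similarity of two data points in the quantum feature space:
    the squared overlap of their encoded states."""
    return abs(np.dot(encode(x), encode(y))) ** 2

print(kernel([0.1, 0.5], [0.1, 0.5]))     # identical points -> 1.0
print(kernel([0.0, 0.0], [np.pi, np.pi]))  # orthogonal encodings -> ~0.0
```

A classical learner can then use this kernel like any other similarity measure; the quantum advantage Gambetta points to would come from encodings whose kernels cannot be evaluated efficiently on classical hardware, unlike this one.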


For now, then, given the limitations of what’s possible today, Gambetta believes that developers should focus on ‘toy problems.’ “You can come up with a sort of interaction that goes on inside a lithium-ion battery or something like that,” he said. “You can try that and you can say: we can test this on a quantum computer to look at scaling. And that’s important to assess how this is progressing for this kind of application.”

And that’s pretty much the state of the industry right now: developers can start to work on problems that go beyond trivial research experiments, but they can’t quite run complex applications just yet. That’s also what makes this moment so exciting for quantum computing. There’s hardware, there’s software, and now developers and researchers get to define how we’ll use these machines once they come out of the lab and reach the point where they can no longer be modeled on a classical machine.