Quantum computing and related technologies – including quantum sensors, communications, and security – are among UK innovators’ strongest suits.
But while much of the media attention in this area has been on non-classical hardware, one British start-up is focusing on creating algorithms for today’s rudimentary devices, and for the more impressive quantum systems that will emerge over the next few years.
Earlier this month, Bristol-based Phasecraft presented a number of papers at the annual Quantum Information Processing conference in Ghent, Belgium.
Fresh from that event, the company says its immediate aim is to significantly reduce the timescale for achieving quantum advantage. That’s the point at which quantum computers will demonstrate superiority over their classical counterparts by solving a problem that would be impossible or impractical to solve by other means.
Phasecraft is doing that by developing more efficient algorithms that will drive the best performance from current and emerging quantum computers, while informing the development of next-generation hardware.
Ashley Montanaro is Phasecraft’s Director and co-founder, and Professor of Quantum Computation at the University of Bristol. He explains:
To make quantum computers do anything interesting and useful, you need to have a quantum algorithm. It’s not enough just to have the hardware itself. In particular, we’re focusing on the near-term era. We want to come up with algorithms for the devices that exist now, or over the next two-to-five-year period, rather than the ones that will exist in 10 or 20 years’ time.
But we think that there are a lot of exciting applications that will lie within that time period. In particular, to do with modelling and simulating quantum physics, which is critically important for designing things like new batteries or solar cells, or new high-temperature superconductors. All of these are systems where quantum mechanics plays a crucial role, which means our best classical methods are not able to keep up.
In other words, if you want to model big, binary things, build a big, binary system. But if you want to model the subatomic world, build a quantum system that runs quantum algorithms.
The real dividing lines
Accordingly, Phasecraft is partnering not just with the likes of Google, IBM, and quantum integrated circuits maker Rigetti Computing, but also with specialist chemicals multinational Johnson Matthey and Oxford University spinout Oxford PV, which develops photovoltaic cells.
These are among the world experts in technologies like batteries and solar power, and they are providing us with the problems and applications-oriented knowhow that will really make our algorithms relevant to practice.
It’s not just about number-crunching. Because quantum computers can represent quantum systems natively, you can encode the quantum state of a physical system directly on a quantum computer. Whereas even writing down those states on a classical computer takes an exponential amount of information and processing.
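That exponential cost is easy to make concrete. The sketch below is illustrative only (it is not from Phasecraft): a general n-qubit state has 2^n complex amplitudes, so simply storing it classically quickly outgrows any machine, assuming 16 bytes per double-precision complex amplitude.

```python
# Illustrative only: classical memory needed just to *store* a general
# n-qubit state vector, assuming one complex128 amplitude (16 bytes)
# per basis state. A quantum computer holds this state natively.

def classical_state_bytes(n_qubits: int) -> int:
    """A general n-qubit state has 2**n complex amplitudes."""
    return (2 ** n_qubits) * 16  # 16 bytes per complex128 amplitude

for n in (30, 40, 50):
    gib = classical_state_bytes(n) / 2**30
    print(f"{n} qubits -> {gib:,.0f} GiB")
```

At 30 qubits the state vector already needs 16 GiB; at 50 qubits it needs around 16 million GiB, which is why classical simulation hits a wall long before the system sizes that matter for materials design.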
Say you’re trying to screen 100 different novel materials for their suitability as a battery anode. You can’t do that reliably in classical software, so you have to actually do the experiments in the lab. But that’s just not feasible with huge numbers of novel materials.
So, the promise of quantum computing is that we’ll be able to screen very large numbers of potential materials, chemical compounds, and so on, for their properties. And then maybe test just a few of those out in the lab to check that the quantum computer gave the right answer. It’s all about the inherent quantum mechanical structure of the problem.
So, what might the critical difference be between today’s quantum systems and those that emerge in 10 or 20 years’ time? Why does the company have a relatively short-term focus when quantum tech would seem to have a very long tail in terms of investment and practical applications?
The real dividing lines will be error correction and fault tolerance. In the long run, we expect that we’ll have quantum computers that can run for an arbitrarily long time before breaking, because they’re going to use quantum error correction to handle the noise and errors that occur during the computation. [Quantum circuits are inherently noisy and susceptible to interference, including from neighbouring qubits.]
Whereas near-term quantum computers, like so-called NISQ [noisy intermediate-scale quantum] devices, will use error mitigation, not full-blown quantum error correction. They’re not yet able to run for arbitrarily long periods. So, it’s extremely important to make the algorithms as efficient as possible and use as few time steps as we can.
But the sort of algorithms that we’re coming up with now will still be relevant, I think, in the long term to fault-tolerant, large-scale quantum computers. It’s just that they’re more relevant in the short term!
Then he adds:
The quality of your cloud-based quantum user interface won’t mean very much if your algorithm isn’t doing anything useful. But the challenge is that nobody has yet shown, convincingly, a quantum computer outperforming a classical one for a problem that people really care about, as opposed to a scientific demonstration. And before you get to that point of quantum advantage, all the other things you can do with cloud interfaces, the user experience, and so on, for us are a lower priority.
But to most IT and business strategists, nearly all of Phasecraft’s own research would seem to be highly technical and theoretical. While it is still in the research lab stage, how can the industry persuade organizations to get behind an investment in quantum computing now as a serious candidate for enterprise applications?
We believe that in the next few years we will start seeing genuinely, practically relevant applications of quantum computing – albeit in specialised domains initially.
But once we get beyond these applications in materials science, chemistry, and pharmaceuticals, in the longer term I expect that quantum computers are going to be useful in many domains where there are complex computational problems to solve. Not because the amount of data involved is really big, but because the problem itself is really, really hard.
For example, within finance there’s the problem of portfolio optimization, where we want to produce a basket of assets that delivers optimal financial returns. And there might be complex correlations between those assets. So, working out which is the optimal financial portfolio is actually a very difficult optimization task. But it’s something I believe, in the long run, quantum computers will be useful for.
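To see why this is hard classically, here is a toy sketch (entirely made up — the assets, returns, and correlation penalties are invented for illustration, and real portfolio models are far richer): scoring a basket as expected return minus penalties for correlated pairs, and finding the best basket by exhaustive search. The search space doubles with every asset added, which is what makes exact optimization infeasible at realistic scale.

```python
# Hypothetical toy portfolio problem: n assets give 2**n - 1 candidate
# baskets, so brute force blows up exponentially with n.
from itertools import combinations

returns = {"A": 0.05, "B": 0.07, "C": 0.04, "D": 0.06}  # invented expected returns
# Invented correlation penalties for holding both assets in a pair
# (keys are alphabetically sorted tuples).
penalty = {("A", "B"): 0.08, ("A", "C"): 0.01, ("B", "D"): 0.09}

def score(portfolio):
    """Expected return minus correlation penalties within the basket."""
    s = sum(returns[a] for a in portfolio)
    for a, b in combinations(sorted(portfolio), 2):
        s -= penalty.get((a, b), 0.0)
    return s

# Exhaustive search over every non-empty basket.
assets = list(returns)
best = max(
    (c for k in range(1, len(assets) + 1) for c in combinations(assets, k)),
    key=score,
)
print(best, round(score(best), 3))
```

Even in this four-asset toy, the best basket (A, C, D) is not obvious from the individual returns — the penalties push the highest-return asset, B, out of the optimum. With thousands of assets, the 2^n search space is exactly the kind of structure where quantum optimization algorithms are hoped to help.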
But the key point about this technology is it is something that has a long lead time, and so forward-looking organizations should be building up their teams gradually. You can’t just jump in, let’s say three years from now, and hope to build a large team of quantum experts overnight. Now is the time to start engaging with the field and building up your knowledge of it.
With quantum computers, there are really only two numbers you need to pay attention to. One is the number of qubits, and the other is the fidelity – how accurate the quantum gates are, or how badly they are affected by errors.
Scaling up the number of qubits is an engineering challenge, but it’s an easier one than improving the fidelity of the gates, the quantum operations in the quantum computer.
In the next few months we will probably see quantum computers with thousands of qubits, and over the next few years, I would expect tens of thousands, even hundreds of thousands. But with fidelity, perhaps a factor of 10 or 100 better than where we are now.
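A back-of-the-envelope sketch (my illustration, not Montanaro's, and assuming independent gate errors — a simplification) shows why that factor of 10 or 100 in fidelity matters more than the qubit count: the chance a circuit runs with no error at all is roughly the per-gate fidelity raised to the number of gates.

```python
# Rough model, assuming independent gate errors: the probability that
# a circuit completes with no error is approximately
# fidelity ** number_of_gates, so gate quality, not qubit count,
# limits how deep a useful NISQ circuit can be.

def success_probability(fidelity: float, n_gates: int) -> float:
    return fidelity ** n_gates

for fidelity in (0.99, 0.999, 0.9999):  # ~1 error per 100 / 1,000 / 10,000 gates
    p = success_probability(fidelity, 1000)  # a 1,000-gate circuit
    print(f"fidelity {fidelity}: P(no error in 1,000 gates) ~ {p:.3f}")
```

Under this simple model, at 99% gate fidelity a 1,000-gate circuit almost never finishes cleanly; at 99.9% it succeeds roughly a third of the time; at 99.99% it succeeds about nine times in ten. A tenfold fidelity improvement changes what is computable far more than another order of magnitude of qubits.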
Like Phasecraft, IT leaders need to see the big picture of quantum technology.