In the race to build a workable quantum computer — a dream at the intersection of advanced physics and computer science since the 1980s — the finish line may be in sight.
A spate of recent technical breakthroughs means leading tech companies are vying to be the first to turn what has until now been a series of lab experiments into full-size, workable systems.
In June, IBM became the latest to claim its path was now clear to a full-scale machine, after publishing a blueprint for a quantum computer that filled in critical missing pieces from its earlier designs. Quantum computers hold the potential to solve problems beyond the reach of today’s machines in fields such as materials science and AI.
“It doesn’t feel like a dream anymore,” said Jay Gambetta, head of IBM’s quantum initiative. “I really do feel like we’ve cracked the code and we’ll be able to build this machine by the end of the decade.”
That has intensified a race against Google, which cleared one of the biggest remaining hurdles late last year and says it is also on course to build an industrial-scale quantum computer by the end of the decade.
“All the [remaining] engineering and scientific challenges are surmountable,” said Julian Kelly, head of hardware at Google Quantum AI.
Yet even as they put some of the hardest science problems behind them and gear up for a sprint to the finish line, the companies still face a raft of more routine-sounding but still difficult engineering problems to industrialise the technology.
The remaining hurdles “seem technically less challenging than the fundamental physics, but we should not underestimate that engineering effort to scale”, said Oskar Painter, the executive in charge of quantum hardware at Amazon Web Services. He predicted that a useful quantum computer was still 15-30 years away.
Reaching industrial scale means taking systems that comprise fewer than 200 qubits — the basic building blocks for quantum machines — and expanding them to 1mn qubits or more. The companies involved compare this to the early days of conventional computing, though quantum computers pose additional challenges.
Among the toughest is the inherent instability of qubits, which hold their quantum states, the condition in which they can perform useful calculations, for only tiny fractions of a second. The loss of those states, known as decoherence, produces errors or “noise” that mounts as ever-larger numbers of qubits are added.
One graphic demonstration of the limits of scaling came when IBM increased the number of qubits in its experimental Condor chip to 1,121, leading to “crosstalk”, or interference, between the components.
Stacking larger numbers of qubits together like this “creates a bizarre effect we can’t control anymore”, said Subodh Kulkarni, chief executive of Rigetti Computing, a US start-up that also works with qubits made from superconductors, the same technology used by IBM and Google. “That’s a nasty physics problem to solve.”
Gambetta said IBM had anticipated the interference seen in its Condor chip, and that it had moved on to a new type of coupler to link its qubits, making its systems less susceptible to the problem.
In the first experimental systems, qubits have been “tuned” individually to improve their performance. Complexity and cost make this impractical at larger scale, leading to a pursuit of more reliable components — something that will require steady improvements in manufacturing, as well as new breakthroughs in materials. Google also says it aims to bring down component costs by a factor of 10 to hit its target cost of $1bn for a full-scale machine.
The companies say their systems will be able to tolerate a degree of imperfection in the qubits thanks to a technique known as error correction. This works by spreading the information held by a single qubit across many physical qubits, creating redundancy so the system can detect and fix the failure of any individual component.
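A rough sense of the trade-off comes from the classical repetition code, in which a single bit is stored in several noisy copies and recovered by majority vote. Real quantum codes are more subtle, since quantum states cannot simply be copied, but the principle of spending extra components to buy reliability is the same. The error rate and numbers of copies in the sketch below are illustrative assumptions, not figures from any of the companies.

```python
import random

def noisy_copy(bit: int, p_flip: float) -> int:
    """Return the bit, flipped with probability p_flip (a toy noise model)."""
    return bit ^ 1 if random.random() < p_flip else bit

def majority_vote(bits: list[int]) -> int:
    """Recover the stored bit by majority vote over the redundant copies."""
    return 1 if sum(bits) > len(bits) // 2 else 0

def logical_error_rate(p_flip: float, copies: int, trials: int = 100_000) -> float:
    """Estimate how often the encoded bit is recovered incorrectly."""
    errors = 0
    for _ in range(trials):
        stored = [noisy_copy(0, p_flip) for _ in range(copies)]
        if majority_vote(stored) != 0:
            errors += 1
    return errors / trials

# With a 5% error rate per copy, adding redundancy drives the effective
# error rate down sharply -- the same trade-off quantum error correction
# makes with physical qubits, by different means.
for n in (1, 3, 5, 7):
    print(n, "copies ->", logical_error_rate(0.05, n))
```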
So far, only Google has demonstrated a quantum chip capable of performing error correction as its size increases. According to Kelly, any company trying to scale up without first reaching this point would end up with “a very expensive machine that outputs noise, and consumes power and a lot of people’s time and engineering effort and does not provide any value at all”.
Others, however, have not slowed their attempts to scale, even though none has yet matched Google.
IBM said its sights were set on what it called the most important challenge: showing it can operate a system at very large scale. It also questioned whether Google’s approach to error correction will work in a full-sized system.
The technique used by Google, known as the surface code, works by connecting each qubit in a two-dimensional grid to its nearest neighbours. It relies on a relatively large number of physical qubits to produce each reliable, error-corrected qubit, which is why a system would need to reach 1mn qubits or more to perform useful calculations.
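A back-of-envelope estimate shows why the numbers get so large. A surface code of “distance” d uses roughly 2d² physical qubits per error-corrected qubit, and its logical error rate shrinks roughly as (p/p_th)^((d+1)/2) once the physical error rate p is below a threshold p_th. The threshold, target error rate and qubit counts in the sketch below are illustrative assumptions, not figures published by Google or IBM, and the constant prefactor in the scaling is ignored.

```python
def surface_code_overhead(p_phys: float, p_threshold: float, p_target: float) -> tuple[int, int]:
    """
    Rough surface-code estimate: find the smallest odd code distance d whose
    logical error rate ~ (p_phys / p_threshold)**((d + 1) / 2) falls below
    p_target, and the ~2*d*d physical qubits that implies per logical qubit.
    """
    ratio = p_phys / p_threshold
    assert ratio < 1, "physical error rate must be below the threshold"
    d = 3
    while ratio ** ((d + 1) / 2) > p_target:
        d += 2  # surface-code distances are odd
    return d, 2 * d * d

# Illustrative numbers only: physical error rate 1e-3, threshold 1e-2,
# target logical error rate 1e-12 per operation.
d, per_logical = surface_code_overhead(1e-3, 1e-2, 1e-12)
print(f"code distance ~{d}, ~{per_logical} physical qubits per logical qubit")
# At roughly 1,000 physical qubits per logical qubit, a machine with
# ~1,000 useful logical qubits already lands around the 1mn mark.
```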
Microsoft has said it opted against a similar design after concluding that trying to build 1mn-qubit machines presented too many other engineering challenges.
IBM changed course to a different form of error correction, known as a low-density parity-check code, which it claims will require 90 per cent fewer qubits than Google’s approach. However, this depends on longer-range connections between qubits that sit further apart on a chip, a difficult technology challenge that has left IBM behind Google in demonstrating error correction.
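Taken at face value, the 90 per cent figure implies a machine roughly a tenth the size for the same error-corrected capacity. The surface-code baseline in the arithmetic below is an illustrative assumption drawn from the 1mn-qubit target mentioned above, not a published IBM or Google figure.

```python
# Illustrative arithmetic only: if a surface-code machine needed ~1mn
# physical qubits for a given workload, a code requiring 90 per cent
# fewer would need ~100,000 for the same workload.
surface_code_machine = 1_000_000   # assumed baseline, per the 1mn target
ldpc_saving = 0.90                 # IBM's claimed reduction
ldpc_machine = int(surface_code_machine * (1 - ldpc_saving))
print(f"{ldpc_machine:,} physical qubits")  # 100,000
```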
Kelly at Google said IBM’s technique added new levels of complexity to systems that were already extremely hard to control, though IBM claimed last month to have succeeded in creating longer connectors for the first time.
The latest IBM design appeared capable of producing a workable, large-scale machine, said Mark Horvath, an analyst at Gartner, though he added that its approach still only existed in theory. “They need to show they can manufacture chips that can do that,” he said.
Regardless of design, the companies face many other common engineering challenges.
These include reducing the rat’s nest of wiring found inside early quantum systems by finding new ways to link large numbers of components into single chips, and then connecting a number of chips into modules. Scaling up will also require much bigger, specialised fridges to house full-scale systems, which operate at extremely low temperatures.
Issues like these highlight basic design decisions that could prove critical as the systems scale. Systems that use superconductors as qubits, such as those from Google and IBM, have shown some of the biggest advances, though their qubits are harder to control and need to operate at temperatures close to absolute zero.
Rival systems that use atoms as qubits — known as trapped ions and neutral atoms — or those that use photons promise to be inherently more stable. But they face several other hurdles, including the difficulty of linking their clusters of qubits together into larger systems and overcoming their slower computing speed.
The costs and technical challenges of trying to scale will probably determine which approaches are more practical.
Sebastian Weidt, chief executive at Universal Quantum, a British start-up working with trapped ions, said government decisions about which technologies to back during this period would probably play a big part in narrowing investment down to “a smaller number of players who can get all the way”.
In one sign of growing official interest in sorting out the winners, Darpa, the Pentagon’s advanced research agency, last year began a broad study of different quantum companies with the aim of identifying which could be expanded to reach practical size the fastest.
Meanwhile, several companies have recently shown off radical new designs for qubits, which they say will be more controllable.
They include Amazon and Microsoft, which claim to have harnessed a new state of matter to create more reliable components. These technologies are at a far earlier stage of development, but their backers claim they will eventually leap ahead.
That has not slowed the companies using older techniques that have been years in development. “Just because it’s hard, doesn’t mean it can’t be done,” said Horvath, echoing the confidence fuelling the industry’s race to scale.