IBM Doubles Its Quantum Computing Volume
IBM announced at CES 2020 that its newest 28-qubit quantum computer, Raleigh, achieved the company’s goal of doubling its Quantum Volume.
Last year IBM declared that in order to achieve quantum advantage within the next decade, the company will need to at least double the Quantum Volume of its quantum computing systems every year. At CES, the company announced that it has added its fourth data point, a new 28-qubit backend, Raleigh, to its progress roadmap and achieved a system demonstrating Quantum Volume of 32.
Quantum Volume (QV) is a hardware-agnostic metric that IBM defined to measure the performance of a real quantum computer. The higher the Quantum Volume, the more real-world, complex problems quantum computers can potentially solve, such as those explored by IBM's quantum network organizations. Quantum Volume takes into account the number of qubits, connectivity, and gate and measurement errors. Material improvements to underlying physical hardware, such as increases in coherence times, reduction of device crosstalk, and software circuit compiler efficiency, can point to measurable progress in Quantum Volume, as long as all improvements happen at a similar pace.
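To make the metric concrete, here is a simplified sketch of how a Quantum Volume score is read off from benchmark results. The pass criterion (heavy-output probability above 2/3 for square circuits of equal width and depth) follows IBM's published QV definition, but this omits the statistical confidence requirements of the full protocol, and the results dictionary is made-up illustrative data, not real device measurements.

```python
# Simplified Quantum Volume scoring sketch (illustrative, not IBM's full
# protocol, which also requires statistical confidence in the pass rate).

def quantum_volume(results):
    """results maps square-circuit width n (width == depth) to the
    measured heavy-output probability. A width "passes" when that
    probability exceeds 2/3; QV is 2**n for the largest passing width.
    Returns 1 when no width passes."""
    passing = [n for n, heavy_prob in results.items() if heavy_prob > 2 / 3]
    return 2 ** max(passing) if passing else 1

# Fabricated example: widths 2-5 clear the 2/3 threshold, width 6 does not,
# so the device scores QV 32 = 2**5.
example = {2: 0.82, 3: 0.76, 4: 0.71, 5: 0.70, 6: 0.61}
print(quantum_volume(example))  # 32
```

This is why a QV of 32 corresponds to reliably running 5-qubit, depth-5 model circuits: the score grows only when wider *and* deeper circuits succeed, which is what ties it to qubit count, connectivity, and error rates all at once.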
Raleigh draws on the improved hexagonal lattice connectivity structure developed for IBM's 53-qubit quantum computer, and features improved coherence. According to IBM, the lattice connectivity reduced gate errors and exposure to crosstalk.
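One intuition for why lattice connectivity matters for crosstalk: the fewer couplers attached to each qubit, the fewer neighbors that can disturb it during gates. The sketch below compares the average coupling degree of a small square grid against a single hexagonal ring; both coupling maps are toy examples I constructed for illustration, not the actual topology of Rochester or Raleigh.

```python
# Toy comparison of coupling-map connectivity (illustrative graphs only,
# not IBM's real device layouts). Lower average degree means each qubit
# has fewer coupled neighbors that can contribute crosstalk.
from collections import Counter

def avg_degree(edges):
    """Average number of coupled neighbors per qubit in a coupling map."""
    deg = Counter()
    for a, b in edges:
        deg[a] += 1
        deg[b] += 1
    return sum(deg.values()) / len(deg)

# 3x3 square grid: interior qubits couple to up to 4 neighbors.
square = [(r * 3 + c, r * 3 + c + 1) for r in range(3) for c in range(2)] + \
         [(r * 3 + c, (r + 1) * 3 + c) for r in range(2) for c in range(3)]

# Single hexagonal ring of 6 qubits: every qubit couples to exactly 2.
hexagon = [(i, (i + 1) % 6) for i in range(6)]

print(round(avg_degree(square), 2))  # 2.67
print(avg_degree(hexagon))           # 2.0
```

In a full hexagonal-style lattice the bulk degree is three rather than two, but the direction of the comparison holds: trading away some connectivity for sparser coupling is a deliberate design choice to suppress crosstalk and gate errors.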
IBM says the achievement of QV 32 is significant not just as another point on the curve, but as confirmation that quantum systems have matured into a new phase, one in which developmental improvements will drive steadily better experimental quantum computing platforms, enable serious research, and bridge toward Quantum Advantage.
Since IBM deployed the first system with five qubits in 2016, the company has progressed to a family of 16-qubit systems, 20-qubit systems, and (most recently) the first 53-qubit system. Within these families of systems, roughly demarcated by the number of qubits (internally, IBM code-names the individual systems after cities and the development threads after birds), IBM has chosen a few to drive generations of learning cycles (Canary, Albatross, Penguin, and Hummingbird).
To hit this latest Quantum Volume milestone, IBM combined lessons learned along the generational development threads with new ideas from research. Last year IBM demonstrated advances in single-qubit coherence, achieving quality factors greater than 10 million on isolated devices. Through iteration and testing, IBM began applying similar techniques to its most advanced integration structures in the larger deployment devices.
Raleigh, IBM's newest 28-qubit backend in the Falcon family, follows the hexagonal lattice structure of the 53-qubit Rochester. Along with some of the upgrades IBM has been building into the later-generation Penguin devices, this carried the company across the QV 32 threshold for the first time.
With Raleigh, IBM says it has improved upon the coherence of some early devices, and the company also sees promising new directions and processes under test that it has just begun to explore on a new development system device.
IBM believes that the next ten years will be the decade of quantum systems, and the emergence of a real hardware ecosystem that will provide the foundation for improving coherence, gates, stability, cryogenics components, integration, and packaging.
Google’s Quantum Supremacy was an important quantum computing event. However, it was a singular achievement by a single company, without an immediate impact on industry applications or society. Quantum Advantage, by contrast, would matter even more than Quantum Supremacy, and several technical obstacles remain to be solved before it arrives. As demonstrated by IBM’s steady progress, Quantum Volume could help us get there.
Quantum Advantage will likely begin with multiple companies announcing breakthroughs for different applications. Then, more and more companies will start running previously untouchable and complex algorithms in a hybrid environment of both classical and quantum computing.