Quantinuum is uniquely known for, and has always put a premium on, demonstrating rather than merely promising breakthroughs in quantum computing.
When we unveiled the first H-Series quantum computer in 2020, we didn't just deliver a world-leading quantum processor; we went the extra mile, including comprehensive, industry-leading benchmarking so that any expert could independently verify our results. Since then, our computers have maintained the lead over all competitors in both performance and transparency. Today, our 56-qubit System Model H2 is the most powerful quantum computer available for industry and scientific research – and the most benchmarked.
More recently, in a period in which we upgraded our H2 system from 32 to 56 qubits and demonstrated the scalability of our QCCD architecture, we also surpassed a quantum volume of two million and announced that we had achieved "three 9s" (99.9%) fidelity, enabling real gains in fault-tolerance – which we proved within months by demonstrating the most reliable logical qubits in the world with our partner Microsoft.
We don’t just promise what the future might look like; we demonstrate it.
Today, at Quantum World Congress, we shared how recent developments by our integrated hardware and software teams have, yet again, accelerated our technology roadmap. It is on the strength of what we have already demonstrated that we can announce that, by the end of this decade, Quantinuum will achieve universal, fully fault-tolerant quantum computing, built on foundations such as a universal fault-tolerant gate set, high-fidelity physical qubits uniquely capable of supporting reliable logical qubits, and a fully scalable architecture.
We also demonstrated, with Microsoft, what rapid acceleration looks like with the creation of 12 highly reliable logical qubits – tripling the number from just a few months ago. Among other demonstrations, we supported Microsoft in creating the first-ever chemistry simulation to combine reliable logical qubits with Artificial Intelligence (AI) and High-Performance Computing (HPC), producing results within chemical accuracy. This is a critical demonstration of what Microsoft has called "the path to a Quantum Supercomputer".
Quantinuum’s H-Series quantum computers, Powered by Honeywell, were among the first devices made available via Microsoft Azure, where they remain available today. Building on this, we are excited to share that Quantinuum and Microsoft have completed the integration of Quantinuum’s InQuanto™ computational quantum chemistry software package with Azure Quantum Elements, the AI-enabled generative chemistry platform. This integration is accessible to customers participating in a private preview of Azure Quantum Elements, which can be requested from Microsoft or Quantinuum.
We created a short video on the importance of logical qubits, which you can see here:
These demonstrations show that we have the tools to drive progress towards scientific and industrial advantage in the coming years. Together, we’re demonstrating how quantum computing may be applied to some of humanity’s most pressing problems, many of which are likely only to be solved with the combination of key technologies like AI, HPC, and quantum computing.
Our credible roadmap draws a direct line from today to hundreds of logical qubits – at which point quantum computing, possibly combined with AI and HPC, will outperform classical computing for a range of scientific problems.
“The collaboration between Quantinuum and Microsoft has established a crucial step forward for the industry and demonstrated a critical milestone on the path to hybrid classical-quantum supercomputing capable of transforming scientific discovery.” – Dr. Krysta Svore – Technical Fellow and VP of Advanced Quantum Development for Microsoft Azure Quantum
What we revealed today underlines the accelerating pace of development. It is now clear that enterprises need to be ready to take advantage of the progress we can see coming in the next business cycle.
The industry consensus is that the latter half of this decade will be critical for quantum computing, prompting many companies to develop roadmaps signaling their path toward error-corrected qubits. Taken together, Quantinuum’s technical and scientific advances accelerate the quantum computing industry and, as we have shown today, reveal a path to universal fault-tolerance much earlier than expected.
Grounded in our prior demonstrations, we now have enough visibility into an accelerated timeline to release an updated, highly credible hardware roadmap. This gives organizations around the world a way to plan reliably for universal, fully fault-tolerant quantum computing. We have shown how we will scale to more physical qubits at fidelities high enough to support the lower error rates made possible by quantum error correction (QEC), with the capacity for universality at the logical level. Universality is non-negotiable when making good on the promise of quantum computing: if a quantum computer isn’t universal, everything it does can be efficiently reproduced on a classical computer.
“Our proven history of driving technical acceleration, as well as the confidence that globally renowned partners such as Microsoft have in us, means that this is the industry’s most bankable roadmap to universal fully fault-tolerant quantum computing,” said Dr. Raj Hazra, Quantinuum’s CEO.
Before the end of the decade, our quantum computers will have thousands of physical qubits, hundreds of logical qubits with error rates below 10⁻⁶, and the full machinery required for universality and fault-tolerance – truly making good on the promise of quantum computing.
Quantinuum has a proven history of achieving our technical goals. This is evidenced by our leadership in hardware, software, and the ecosystem of developer tools that make quantum computing accessible. Our leadership in quantum volume and fidelity, our consistent cadence of breakthrough publications, and our collaborations with enterprises such as Microsoft showcase our commitment to pushing the boundaries of what is possible.
We are now making an even stronger public commitment to deliver on our roadmap, ushering the industry toward the era of universal, fully fault-tolerant quantum computing this decade. We have all the machinery in place for fault-tolerance with error rates around 10⁻⁶, meaning we will be able to run circuits that are millions of gates deep – putting us on a trajectory for scientific quantum advantage, and beyond.
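To see why an error rate around 10⁻⁶ is what unlocks million-gate circuits, a rough back-of-envelope helps (a sketch under simplifying assumptions: independent errors at a fixed rate p per logical operation, with no further mitigation). A circuit of D operations then succeeds with probability roughly (1 − p)^D:

```python
# Back-of-envelope: success probability of a deep logical circuit,
# assuming independent errors at rate p per logical operation.
p = 1e-6  # logical error rate ~10^-6

for depth in (10_000, 100_000, 1_000_000):
    success = (1 - p) ** depth
    print(f"{depth:>9} gates: ~{success:.0%} chance of no logical error")

# ~99% at 10^4 gates, ~90% at 10^5, ~37% at 10^6
```

At error rates of 10⁻³ or 10⁻⁴, by contrast, a million-gate circuit fails almost every time – which is why pushing logical error rates down by orders of magnitude matters more than adding raw qubits.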
Quantinuum, the world’s largest integrated quantum company, pioneers powerful quantum computers and advanced software solutions. Quantinuum’s technology drives breakthroughs in materials discovery, cybersecurity, and next-gen quantum AI. With over 500 employees, including 370+ scientists and engineers, Quantinuum leads the quantum computing revolution across continents.
At this year’s Q2B Silicon Valley conference from December 10th – 12th in Santa Clara, California, the Quantinuum team will be participating in plenary and case study sessions to showcase our quantum computing technologies.
Schedule a meeting with us at Q2B
Meet our team at Booth #G9 to discover how Quantinuum is charting the path to universal, fully fault-tolerant quantum computing.
Join our sessions:
Plenary: Advancements in Fault-Tolerant Quantum Computation: Demonstrations and Results
There is industry-wide consensus on the need for fault-tolerant QPUs, but demonstrations of these capabilities are less common. In this talk, Dr. Hayes will review Quantinuum’s long list of meaningful demonstrations in fault-tolerance, including real-time error correction, a variety of codes ranging from the surface code to exotic qLDPC codes, logical benchmarking, and beyond-break-even behavior on multiple codes and circuit families.
Keynote: Quantum Tokens: Securing Digital Assets with Quantum Physics
Mitsui’s Deputy General Manager, Quantum Innovation Dept., Corporate Development Div., Koji Naniwada, and Quantinuum’s Head of Cybersecurity, Duncan Jones, will deliver a keynote presentation on a case study for quantum in cybersecurity. Together, our organizations demonstrated the first implementation of quantum tokens over a commercial QKD network. Quantum tokens enable three previously incompatible properties: unforgeability guaranteed by physics, fast settlement without centralized validation, and user privacy until redemption. We present results from our successful Tokyo trial using NEC’s commercial QKD hardware and discuss potential applications in financial services.
Quantinuum and Mitsui Sponsored Happy Hour
Join the Quantinuum and Mitsui teams in the expo hall for a networking happy hour.
Particle accelerator projects like the Large Hadron Collider (LHC) don’t just smash particles – they also power the invention of some of the world’s most impactful technologies. A favorite example is the World Wide Web, which was developed for particle physics experiments at CERN.
Tech designed to unlock the mysteries of the universe has brutally exacting requirements – and it is this boundary-pushing, plus billion-dollar budgets, that has led to so much innovation.
For example, X-rays are used in accelerators to measure the chemical composition of the accelerator products and to monitor radiation. The understanding developed to create those technologies was then applied to help us build better CT scanners, reducing the X-ray dosage while improving image quality.
Stories like this are common in accelerator physics, or High Energy Physics (HEP). Scientists and engineers working in HEP have been early adopters and/or key drivers of innovations in advanced cancer treatments (using proton beams), machine learning techniques, robots, new materials, cryogenics, data handling and analysis, and more.
A key strand of HEP research aims to make accelerators simpler and cheaper, and one piece of infrastructure ripe for improvement is their computing environment.
CERN itself has said: “CERN is one of the most highly demanding computing environments in the research world... From software development, to data processing and storage, networks, support for the LHC and non-LHC experimental programme, automation and controls, as well as services for the accelerator complex and for the whole laboratory and its users, computing is at the heart of CERN’s infrastructure.”
With accelerators generating data in excess of an exabyte (a billion gigabytes) annually, tens of millions of lines of code written to support the experiments, and incredibly demanding hardware requirements, it’s no surprise that the HEP community is interested in quantum computing, which offers real solutions to some of its hardest problems.
As the authors of this paper stated: “[Quantum Computing] encompasses several defining characteristics that are of particular interest to experimental HEP: the potential for quantum speed-up in processing time, sensitivity to sources of correlations in data, and increased expressivity of quantum systems... Experiments running on high-luminosity accelerators need faster algorithms; identification and reconstruction algorithms need to capture correlations in signals; simulation and inference tools need to express and calculate functions that are classically intractable.”
The HEP community’s interest in quantum computing is growing. In recent years, their scientists have been looking carefully at how quantum computing could help them, publishing a number of papers discussing the challenges and requirements for quantum technology to make a dent (here’s one example, and here’s the arXiv version).
In the past few months, what was previously theoretical has started becoming a reality. Several groups have published results using quantum machines to tackle “Lattice Gauge Theory”, a type of math used to describe a broad range of phenomena in HEP (and beyond). Two papers came from academic groups using quantum simulators, one based on trapped ions and one on neutral atoms. Another group, including scientists from Google, tackled Lattice Gauge Theory using a superconducting quantum computer. Taken together, these papers indicate a growing interest in using quantum computing for High Energy Physics beyond the simple one-dimensional systems that are more easily accessible with classical methods such as tensor networks.
We have been working with DESY, one of the world’s leading accelerator centers, to help make quantum computing useful for their work. DESY, short for Deutsches Elektronen-Synchrotron, is a national research center that operates, develops, and constructs particle accelerators, and is part of the worldwide computer network used to store and analyze the enormous flood of data that is produced by the LHC in Geneva.
Our first publication from this partnership describes a quantum machine learning technique for untangling data from the LHC, finding that in some cases the quantum approach was indeed superior to the classical approach. More recently, we used our Quantinuum System Model H1 to tackle Lattice Gauge Theory (LGT), a favorite contender for quantum advantage in HEP.
Lattice Gauge Theories are one approach to solving what are more broadly referred to as “quantum many-body problems”. Quantum many-body problems lie at the border of our knowledge in many different fields, from the electronic structure problem, which impacts chemistry and pharmaceuticals, and the quest to understand and engineer new material properties such as light-harvesting materials, to basic research such as high energy physics, which aims to understand the fundamental constituents of the universe, and condensed matter physics, where our understanding of phenomena like high-temperature superconductivity is still incomplete.
The difficulty in solving problems like this – analytically or computationally – is that the problem complexity grows exponentially with the size of the system. For example, there are 36 possible configurations of two six-faced dice (1 and 1, 1 and 2, 1 and 3, and so on), while for ten dice there are more than sixty million configurations.
Quantum computing may be very well suited to tackling problems like this, because a quantum processor’s information density scales in the same way: each qubit added to a QPU doubles the information the system can hold. Our 56-qubit System Model H2, for example, can hold quantum states that would require 128 × 2⁵⁶ bits to describe (with double-precision numbers) on a classical supercomputer – more information than the biggest supercomputer in the world can hold in memory.
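To make those numbers concrete, here is a quick back-of-envelope check in Python (a minimal sketch; the figures follow directly from the arithmetic above):

```python
# Classical configuration spaces grow exponentially:
# k six-faced dice have 6**k configurations.
print(6**2)   # 36 configurations for two dice
print(6**10)  # 60,466,176 -- over sixty million for ten dice

# A quantum state vector on n qubits needs 2**n complex amplitudes.
# In double precision, each amplitude takes 128 bits (16 bytes).
n_qubits = 56
amplitudes = 2**n_qubits
bytes_needed = amplitudes * 16  # 128 bits = 16 bytes per amplitude

print(f"{bytes_needed / 1e18:.2f} exabytes")  # ~1.15 EB of memory
```

For comparison, the memory of today’s largest supercomputers is measured in petabytes – roughly a thousand times smaller than the ~1.15 exabytes a full 56-qubit state vector would demand.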
The joint team made significant progress in approaching the Lattice Gauge Theory corresponding to Quantum Electrodynamics, the theory of light and matter. For the first time, they were able to study the full wavefunction of a two-dimensional confining system with gauge fields and dynamical matter fields on a quantum processor. They were also able to visualize the confining string and the string-breaking phenomenon at the level of the wavefunction, across a range of interaction strengths.
The team approached the problem by first defining the Hamiltonian with the InQuanto software package, then using InQuanto’s reusable protocols to compute both projective measurements and expectation values. InQuanto also allowed easy integration of measurement-reduction techniques and scalable error-mitigation techniques, while the emulator and hardware experiments were orchestrated through the Nexus online platform.
In one section of the study, a circuit with 24 qubits and more than 250 two-qubit gates was reduced to a width of just 15 qubits, thanks to our unique qubit-reuse and mid-circuit-measurement automatic compilation implemented in TKET.
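The underlying trick can be illustrated in a few lines of pytket (a hand-rolled sketch of the idea, not the automated TKET pass itself, and the circuit below is a toy example rather than anything from the study): once a qubit has been measured for the last time, a mid-circuit reset returns it to |0⟩ so the same physical wire can stand in for a qubit that hasn’t been used yet.

```python
from pytket import Circuit

# Without reuse: a 3-qubit circuit in which qubit 0 is measured early
# and qubit 2 is only needed afterwards.
wide = Circuit(3, 3)
wide.H(0).CX(0, 1)
wide.Measure(0, 0)  # qubit 0 is finished after this point
wide.H(2).CX(2, 1)
wide.Measure(1, 1).Measure(2, 2)

# With reuse: measure qubit 0 mid-circuit, reset it, and let the same
# wire play the role of qubit 2 -- same logic on only 2 qubits.
narrow = Circuit(2, 3)
narrow.H(0).CX(0, 1)
narrow.Measure(0, 0)  # mid-circuit measurement
narrow.Reset(0)       # back to |0>, ready for reuse
narrow.H(0).CX(0, 1)  # the former "qubit 2" now lives on wire 0
narrow.Measure(1, 1).Measure(0, 2)

print(wide.n_qubits, narrow.n_qubits)  # 3 vs 2
```

The automated compilation generalizes this idea, analyzing when each qubit is last used so that circuits too wide for a given device can still run on it.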
This work paves the way towards using quantum computers to study lattice gauge theories in higher dimensions, with the goal of one day simulating the full three-dimensional Quantum Chromodynamics theory underlying the nuclear sector of the Standard Model of particle physics. Being able to simulate full 3D quantum chromodynamics will undoubtedly unlock many of Nature’s mysteries, from the Big Bang to the interior of neutron stars, and is likely to lead to applications we haven’t yet dreamed of.