Quantum Volume Testing: Setting the Steady Pace to Higher Performing Devices

May 11, 2022

When it comes to completing the statistical tests and other steps necessary for calculating quantum volume, few people have as much experience as Dr. Charlie Baldwin.

Baldwin, a lead physicist at Quantinuum, and his team have performed the tests numerous times on three different H-Series quantum computers, which have set six industry records for measured quantum volume since 2020.

Quantum volume is a benchmark developed by IBM in 2019 to measure the overall performance of a quantum computer, regardless of the underlying hardware technology. (Quantinuum builds trapped-ion systems.)

Baldwin’s experience with quantum volume prompted him to share what he’s learned and suggest ways to improve the benchmark in a peer-reviewed paper published this week in Quantum.

“We’ve learned a lot by running these tests and believe there are ways to make quantum volume an even stronger benchmark,” Baldwin said.

We sat down with Baldwin to discuss quantum volume, the paper, and the team’s findings.

How is quantum volume measured? What tests do you run?

Quantum volume is measured by running many randomly constructed circuits on a quantum computer and comparing the outputs to a classical simulation. The circuits are chosen to require random gates and random connectivity so that no single architecture is favored. We follow the construction proposed by IBM to build the circuits.
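As a rough illustration of that construction, the sketch below builds one quantum-volume-style circuit (random pairings of qubits, each pair acted on by a Haar-random two-qubit unitary, repeated for as many layers as there are qubits) and computes its ideal output distribution by brute force. This is a simplified reading of IBM's published construction, not Quantinuum's production test code.

```python
import numpy as np

def haar_su4(rng):
    """Haar-random 4x4 unitary via QR decomposition of a complex Gaussian."""
    z = (rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    d = np.diag(r)
    return q * (d / np.abs(d))          # fix column phases for the Haar measure

def apply_two_qubit(state, u, a, b, m):
    """Apply a 4x4 unitary to qubits a, b of an m-qubit statevector."""
    psi = np.moveaxis(state.reshape([2] * m), [a, b], [0, 1])
    psi = np.tensordot(u.reshape(2, 2, 2, 2), psi, axes=([2, 3], [0, 1]))
    return np.moveaxis(psi, [0, 1], [a, b]).reshape(-1)

def qv_ideal_probs(m, rng):
    """Ideal output distribution of one m-qubit, m-layer QV-style circuit."""
    state = np.zeros(2 ** m, dtype=complex)
    state[0] = 1.0
    for _ in range(m):                   # m layers of random two-qubit gates
        perm = rng.permutation(m)        # random pairing = random connectivity
        for i in range(0, m - 1, 2):
            state = apply_two_qubit(state, haar_su4(rng), perm[i], perm[i + 1], m)
    return np.abs(state) ** 2

# Heavy outputs: bitstrings whose ideal probability exceeds the median
rng = np.random.default_rng(7)
probs = qv_ideal_probs(4, rng)
heavy_mask = probs > np.median(probs)
ideal_hop = probs[heavy_mask].sum()      # ideal heavy output probability
```

For large random circuits the ideal heavy output probability approaches (1 + ln 2)/2 ≈ 0.85, so a low-error device should land near that value, while a completely noise-dominated one falls to 0.5.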

What does quantum volume measure? Why is it important?

In some sense, quantum volume only measures your ability to run the specific set of random quantum volume circuits. That probably doesn’t sound very useful if you have some other application in mind for a quantum computer, but quantum volume is sensitive to many aspects that we believe are key to building more powerful devices.

Quantum computers are built up from many parts. Different parts—for example, single- and two-qubit gates—have been developed independently over decades of academic research. When these parts are put together in a large quantum circuit, other errors often creep in and can degrade the overall performance. That's what makes full-system tests like quantum volume so important: they're sensitive to these errors.

Increasing quantum volume requires adding more qubits while simultaneously decreasing errors. Our quantum volume results demonstrate all the amazing progress Quantinuum has made at upgrading our trapped-ion systems to include more qubits and identifying and mitigating errors so that users can expect high-fidelity performance on many other algorithms.

You’ve been running quantum volume tests since 2020. What is your biggest takeaway?

I think there are a couple of things I’ve learned. First, quantum volume isn’t an easy test to run on current machines. While it doesn’t necessarily require a lot of qubits, it does have fairly demanding error requirements. That’s also clear when comparing progress in quantum volume tests across different platforms, which researchers at Los Alamos National Lab did in a recent paper.

Second, I’m always impressed by the continuous and sustained performance progress that our hardware team achieves. And that the progress is actually measurable by using the quantum volume benchmark.

The hardware team has pushed down many different error sources over the last year while also running customer jobs, and the quantum volume measurements make that progress concrete. For example, H1-2 launched in Fall 2021 with QV=128; since then, the team has implemented many performance upgrades, reaching QV=4096 roughly eight months later while continuing to run commercial jobs.

What are the key findings from your paper?

The paper covers four smaller findings that, taken together, we believe give a clearer view of the quantum volume test.

First, we explored how compiling the quantum volume circuits scales with qubit number, and we proposed using arbitrary-angle gates to improve performance—an optimization that many companies are currently exploring.

Second, we studied how quantum volume circuits behave without errors to better relate circuit results to ideal performance.

Third, we ran many numerical simulations to see how the quantum volume test behaved with errors and constructed a method to efficiently estimate performance in larger future systems.

Finally, and I think most importantly, we explored what it takes to meet the quantum volume threshold and what passing it implies about the ability of the quantum computer, especially compared to the requirements for quantum error correction.
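A common back-of-the-envelope way to reason about the third point above—how errors eat into the heavy output probability—is a global depolarizing toy model: with probability equal to the circuit fidelity the ideal circuit runs, and otherwise the output is uniformly random, hitting a heavy outcome roughly half the time. This is a standard simplification for illustration, not the estimator developed in the paper.

```python
import numpy as np

def hop_depolarizing(hop_ideal, fidelity, heavy_fraction=0.5):
    """Toy model: heavy output probability under global depolarizing noise.

    With probability `fidelity` the circuit behaves ideally; otherwise the
    output is uniform over bitstrings, landing on a heavy outcome with
    probability `heavy_fraction` (about 1/2 for QV circuits).
    """
    return fidelity * hop_ideal + (1 - fidelity) * heavy_fraction

ideal_hop = (1 + np.log(2)) / 2          # asymptotic ideal HOP, ~0.847
# Circuit fidelity at which the *mean* HOP sits exactly at the 2/3 threshold
fidelity_at_threshold = (2 / 3 - 0.5) / (ideal_hop - 0.5)
```

Under this toy model, mean circuit fidelity must sit somewhere around 0.5 before the test can pass at all, and higher still once the statistical confidence requirement on the measured HOP is included.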

What does it take to “pass” the quantum volume threshold?

Passing the threshold for quantum volume is defined by the results of a statistical test on the output of the circuits called the heavy output test. The result of the heavy output test—called the heavy output probability, or HOP—must clear the 2/3 threshold even after accounting for statistical uncertainty.

Originally, IBM constructed a method to estimate that uncertainty based on some assumptions about the distribution and number of samples. They acknowledged that this construction was likely too conservative, meaning it made much larger uncertainty estimates than necessary.

We were able to confirm this with numerical simulations and proposed a different method that yields much tighter uncertainty estimates, which we also validated numerically. The method allows us to run the test with many fewer circuits while maintaining the same confidence in the returned estimate.
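To illustrate the kind of analysis involved, the sketch below estimates the HOP from per-shot heavy/not-heavy outcomes and attaches a plain percentile-bootstrap confidence interval. This is a generic statistical recipe on synthetic data, not the specific tighter construction from the paper.

```python
import numpy as np

def hop_with_ci(heavy_flags, n_boot=1000, alpha=0.05, seed=0):
    """HOP estimate with a percentile-bootstrap confidence interval.

    heavy_flags: one boolean per shot, pooled over circuits -- True when the
    sampled bitstring was a heavy output of its circuit.
    """
    rng = np.random.default_rng(seed)
    flags = np.asarray(heavy_flags, dtype=float)
    hop = flags.mean()
    resamples = rng.choice(flags, size=(n_boot, flags.size)).mean(axis=1)
    low, high = np.quantile(resamples, [alpha / 2, 1 - alpha / 2])
    return hop, low, high

# Synthetic run: 5,000 shots with a true heavy output probability of 0.75
flags = np.random.default_rng(1).random(5_000) < 0.75
hop, low, high = hop_with_ci(flags)
passes = low > 2 / 3                     # pass if the lower bound clears 2/3
```

Tighter intervals matter in practice because every extra circuit costs machine time: the smaller the uncertainty for a fixed number of shots, the fewer circuits a device needs to run to demonstrate a given quantum volume.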

How do you think the quantum volume test can be improved?

Quantum volume has been criticized for a variety of reasons, but I think there’s still a lot to like about the test. Unlike some other full-system tests, quantum volume has a well-defined procedure, requires challenging circuits, and sets reasonable fidelity requirements.

However, it still has some room for improvement. As machines start to scale up, runtime will become an important dimension to probe. IBM has proposed a metric for measuring the run time of quantum volume tests (CLOPS). We agree that the duration of the computation is important, but there should also be tests that balance run time against fidelity, sometimes called “time-to-solution.”

Another aspect that could be improved is filling the gap between when quantum volume is no longer feasible to run—at around 30 qubits—and larger machines. There’s recent work in this area that will be interesting to compare to quantum volume tests.

You presented these findings to IBM researchers who first proposed the benchmark. How was that experience?

It was great to talk to the experts at IBM. They have so much knowledge and experience on running and testing quantum computers. I’ve learned a lot from their previous work and publications.

There is a lot of debate about quantum volume and how long it will be a useful benchmark. What are your thoughts?

The current iteration of quantum volume definitely has an expiration date. It’s limited by our ability to classically simulate the system, so outgrowing quantum volume is actually a goal of quantum computing development. Even so, quantum volume is a good measuring stick for early development.

Building a large-scale quantum computer is an incredibly challenging task. Like any large project, you break the task up into milestones that you can reach in a reasonable amount of time.

It's like if you want to run a marathon. You wouldn’t start your training by trying to run a marathon on Day 1. You’d build up the distance you run every day at a steady pace. The quantum volume test has been setting our pace of development to steadily reach our goal of building ever higher performing devices.

About Quantinuum

Quantinuum, the world’s largest integrated quantum company, pioneers powerful quantum computers and advanced software solutions. Quantinuum’s technology drives breakthroughs in materials discovery, cybersecurity, and next-gen quantum AI. With over 500 employees, including 370+ scientists and engineers, Quantinuum leads the quantum computing revolution across continents. 

April 11, 2025
Quantinuum’s partnership with RIKEN bears fruit

Last year, we joined forces with RIKEN, Japan's largest comprehensive research institution, to install our hardware at RIKEN’s campus in Wako, Saitama. This deployment is part of RIKEN’s project to build a quantum-HPC hybrid platform consisting of high-performance computing systems, such as the supercomputer Fugaku, and Quantinuum Systems.

Today, a paper published in Physical Review Research marks the first of many breakthroughs coming from this international supercomputing partnership. The team from RIKEN and Quantinuum joined up with researchers from Keio University to show that quantum information can be delocalized (scrambled) using a quantum circuit modeled after periodically driven systems.  

"Scrambling" of quantum information happens in many quantum systems, from those found in complex materials to black holes.  Understanding information scrambling will help researchers better understand things like thermalization and chaos, both of which have wide reaching implications.

To visualize scrambling, imagine a set of particles (say bits in a memory), where one particle holds specific information that you want to know. As time marches on, the quantum information will spread out across the other bits, making it harder and harder to recover the original information from local (few-bit) measurements.

While many classical techniques exist for studying complex scrambling dynamics, quantum computing has long been seen as a promising tool for these studies, due to its inherently quantum nature and the ease with which it implements quantum resources like entanglement. The joint team bore that out with their latest result, which shows not only that scrambling states can be generated on a quantum computer, but that they behave as expected and are ripe for further study.

Thanks to this new understanding, we now know that the preparation, verification, and application of a scrambling state, a key quantum information state, can be consistently realized using currently available quantum computers. Read the paper here, and read more about our partnership with RIKEN here.  

April 4, 2025
Why is everyone suddenly talking about random numbers? We explain.

In our increasingly connected, data-driven world, cybersecurity threats are more frequent and sophisticated than ever. To safeguard modern life, government and business leaders are turning to quantum randomness.

What is quantum randomness, and why should you care?

The term to know: quantum random number generators (QRNGs).

QRNGs exploit quantum mechanics to generate truly random numbers, providing the highest level of cryptographic security. This supports, among many things:

  • Protection of personal data
  • Secure financial transactions
  • Safeguarding of sensitive communications
  • Prevention of unauthorized access to medical records
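The underlying principle is simple to state: measuring a qubit prepared in an equal superposition yields a bit that is random by the laws of physics rather than by algorithm. The sketch below only *simulates* that principle, emulating the Born rule with a classical pseudorandom generator; a real QRNG draws its entropy from actual quantum hardware.

```python
import numpy as np

def simulated_qrng_bits(n, seed=0):
    """Classical *simulation* of the QRNG principle (illustrative only)."""
    rng = np.random.default_rng(seed)
    plus = np.array([1, 1]) / np.sqrt(2)      # H|0> = |+>, equal superposition
    p_one = np.abs(plus[1]) ** 2              # Born rule: P(measure 1) = 1/2
    return (rng.random(n) < p_one).astype(int)  # each measurement -> one raw bit

bits = simulated_qrng_bits(100_000)
```

In a hardware QRNG, each measurement outcome is fundamentally unpredictable; raw bits are then typically post-processed (randomness extraction) before cryptographic use.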

Quantum technologies, including QRNGs, could protect up to $1 trillion in digital assets annually, according to a recent report by the World Economic Forum and Accenture.

Which industries will see the most value from quantum randomness?

The World Economic Forum report identifies five industry groups where QRNGs offer high business value and clear commercialization potential within the next few years. Those include:

  1. Financial services
  2. Information and communication technology
  3. Chemicals and advanced materials
  4. Energy and utilities
  5. Pharmaceuticals and healthcare

In line with these trends, recent research by The Quantum Insider projects the quantum security market will grow from approximately $0.7 billion today to $10 billion by 2030.

When will quantum randomness reach commercialization?

Quantum randomness is already being deployed commercially:

  • Early adopters use our Quantum Origin in data centers and smart devices.
  • Amid rising cybersecurity threats, demand is growing in regulated industries and critical infrastructure.

Recognizing the value of QRNGs, the financial services sector is accelerating its path to commercialization.

  • Last year, HSBC conducted a pilot combining Quantum Origin and post-quantum cryptography to future-proof gold tokens against “store now, decrypt later” (SNDL) threats.
  • And, just last week, JPMorganChase made headlines by using our quantum computer for the first successful demonstration of certified randomness.

On the basis of the latter achievement, we aim to broaden our cybersecurity portfolio with the addition of a certified randomness product in 2025.

How is quantum randomness being regulated?

The National Institute of Standards and Technology (NIST) sets the cryptographic standards used in the U.S. and recognized in many other countries.

  • NIST’s SP 800-90B framework assesses the quality of random number generators.
  • The framework is part of the FIPS 140 standard, which sets the security requirements for cryptographic modules.
  • Organizations must comply with FIPS 140 for their cryptographic products to be used in regulated environments.

This week, we announced Quantum Origin received NIST SP 800-90B Entropy Source validation, marking the first software QRNG approved for use in regulated industries.

What does NIST validation mean for our customers?

This means Quantum Origin is now available for high-security cryptographic systems and integrates seamlessly with NIST-approved solutions without requiring recertification.

  • Unlike hardware QRNGs, Quantum Origin requires no network connectivity, making it ideal for air-gapped systems.
  • For federal agencies, it complements our "U.S. Made" designation, easing deployment in critical infrastructure.
  • It adds further value for customers building hardware security modules, firewalls, PKIs, and IoT devices.

The NIST validation, combined with our peer-reviewed papers, further establishes Quantum Origin as the leading QRNG on the market.  

--

It is paramount for governments, commercial enterprises, and critical infrastructure to stay ahead of evolving cybersecurity threats to maintain societal and economic security.

Quantinuum delivers the highest quality quantum randomness, enabling our customers to confront the most advanced cybersecurity challenges present today.

March 28, 2025
Being Useful Now – Quantum Computers and Scientific Discovery

The most common question in the public discourse around quantum computers has been, “When will they be useful?” We have an answer.

Very recently in Nature we announced a successful demonstration of a quantum computer generating certifiable randomness, a critical underpinning of our modern digital infrastructure. We explained how we will be taking a product to market this year, based on that advance – one that could only be achieved because we have the world’s most powerful quantum computer.

Today, we have made another huge leap in a different domain, providing fresh evidence that our quantum computers are the best in the world. In this case, we have shown that our quantum computers can be a useful tool for advancing scientific discovery.

Understanding magnetism

Our latest paper shows how our quantum computer rivals the best classical approaches in expanding our understanding of magnetism. This provides an entry point that could lead directly to innovations in fields from biochemistry, to defense, to new materials. These are tangible and meaningful advances that will deliver real world impact.

To achieve this, we partnered with researchers from Caltech, Fermioniq, EPFL, and the Technical University of Munich. The team used Quantinuum’s System Model H2 to simulate quantum magnetism at a scale and level of accuracy that pushes the boundaries of what we know to be possible.

As the authors of the paper state:

“We believe the quantum data provided by System Model H2 should be regarded as complementary to classical numerical methods, and is arguably the most convincing standard to which they should be compared.”

Our computer simulated the quantum Ising model, a model for quantum magnetism that describes a set of magnets (physicists call them ‘spins’) on a lattice that can point up or down, and prefer to point the same way as their neighbors. The model is inherently “quantum” because the spins can move between up and down configurations by a process known as “quantum tunneling”.  
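For readers who want to see the model concretely, here is a minimal brute-force version of a transverse-field Ising chain, H = -J Σ Z_i Z_{i+1} - h Σ X_i (a standard textbook form; the paper's exact conventions may differ). It simulates a small quench classically by exact diagonalization, something only feasible at toy scales far below what System Model H2 reaches.

```python
import numpy as np
from functools import reduce

I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=float)
Z = np.array([[1, 0], [0, -1]], dtype=float)

def op_on(site_ops, n):
    """Tensor a dict {site: 2x2 operator} up to a full n-qubit operator."""
    return reduce(np.kron, [site_ops.get(i, I2) for i in range(n)])

def ising_hamiltonian(n, J=1.0, h=1.0):
    H = np.zeros((2 ** n, 2 ** n))
    for i in range(n - 1):
        H -= J * op_on({i: Z, i + 1: Z}, n)   # neighbors prefer to align
    for i in range(n):
        H -= h * op_on({i: X}, n)             # transverse field drives tunneling
    return H

# Quench: start with all spins up, evolve, track magnetization of site 0
n = 6
H = ising_hamiltonian(n)
evals, evecs = np.linalg.eigh(H)
psi0 = np.zeros(2 ** n)
psi0[0] = 1.0                                 # |000000> = all spins up
t = 1.0
psit = evecs @ (np.exp(-1j * evals * t) * (evecs.conj().T @ psi0))
mz = np.real(psit.conj() @ op_on({0: Z}, n) @ psit)
```

The exponential cost is visible in the code itself: the matrices are 2^n by 2^n, so each added spin quadruples the memory, which is exactly why simulating the model's dynamics at larger scales overwhelms classical methods.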

Gaining material insights

Researchers have struggled to simulate the dynamics of the Ising model at larger scales due to the enormous computational cost of doing so. Nobel laureate physicist Richard Feynman, who is widely considered to be the progenitor of quantum computing, once said, “it is impossible to represent the results of quantum mechanics with a classical universal device.” When attempting to simulate quantum systems at comparable scales on classical computers, the computational demands can quickly become overwhelming. It is the inherent ‘quantumness’ of these problems that makes them so hard classically, and conversely, so well-suited for quantum computing.

These inherently quantum problems also lie at the heart of many complex and useful material properties. The quantum Ising model is an entry point to confront some of the deepest mysteries in the study of interacting quantum magnets. While rooted in fundamental physics, its relevance extends to wide-ranging commercial and defense applications, including medical test equipment, quantum sensors, and the study of exotic states of matter like superconductivity.  

Instead of tailored demonstrations that claim ‘quantum advantage’ in contrived scenarios, our breakthroughs announced this week show that we can tackle complex, meaningful scientific questions that are difficult for classical methods to address. In the work described in this paper, we have shown that quantum computing could become the gold standard for materials simulations. These developments are critical steps toward realizing the potential of quantum computers.

With only 56 qubits in our commercially available System Model H2, the most powerful quantum system in the world today, we are already testing the limits of classical methods, and in some cases exceeding them. Later this year, we will introduce our far more powerful 96-qubit Helios system, going beyond what until recently was deemed possible.
