Particle accelerators like the LHC take serious computing power. Often on the bleeding edge of computing technology, accelerator projects sometimes even drive innovations in computing. In fact, while there is some controversy over exactly where the World Wide Web was created, it is often attributed to Tim Berners-Lee at CERN, who developed it to meet the demand for automated information-sharing between scientists at universities and institutes around the world.
With accelerators generating data at rates exceeding an exabyte (a billion gigabytes) per year, tens of millions of lines of code written to support the experiments, and incredibly demanding hardware requirements, it’s no surprise that the High Energy Physics (HEP) community is interested in quantum computing, which offers real solutions to some of its hardest problems. Furthermore, the HEP community is well-positioned to support the early stages of technological development: with budgets in the tens of billions of dollars per year and tens of thousands of scientists and engineers working on accelerator and computational physics, this is a ripe industry for quantum computing to tap.
As the authors of this paper stated: “[Quantum Computing] encompasses several defining characteristics that are of particular interest to experimental HEP: the potential for quantum speed-up in processing time, sensitivity to sources of correlations in data, and increased expressivity of quantum systems... Experiments running on high-luminosity accelerators need faster algorithms; identification and reconstruction algorithms need to capture correlations in signals; simulation and inference tools need to express and calculate functions that are classically intractable.”
The authors go on to state: “Within the existing data reconstruction and analysis paradigm, access to algorithms that exhibit quantum speed-ups would revolutionize the simulation of large-scale quantum systems and the processing of data from complex experimental set-ups. This would enable a new generation of precision measurements to probe deeper into the nature of the universe. Existing measurements may contain the signatures of underlying quantum correlations or other sources of new physics that are inaccessible to classical analysis techniques. Quantum algorithms that leverage these properties could potentially extract more information from a given dataset than classical algorithms.”
Our scientists have been working with a team at DESY, one of the world’s leading accelerator centers, to bring the power of quantum computing to particle physics. DESY, short for Deutsches Elektronen-Synchrotron, is a national research center for fundamental science located in Hamburg and Zeuthen, where the Center for Quantum Technologies and Applications (CQTA) is based. DESY operates, develops, and constructs particle accelerators used to investigate the structure, dynamics, and function of matter, and conducts a broad spectrum of interdisciplinary scientific research. DESY employs about 3,000 staff members from more than 60 nations, and is part of the worldwide computer network that stores and analyzes the enormous flood of data produced by the LHC in Geneva.
In a recent paper, our scientists collaborated with researchers from DESY, the Leiden Institute of Advanced Computer Science (LIACS), and Northeastern University to explore using a generative quantum machine learning model, called a “quantum Boltzmann machine”, to untangle data from CERN’s LHC.
The goal was to learn probability distributions relevant to high energy physics better than the corresponding classical models can. The data specifically contains “particle jet events”: records of the collimated sprays of subatomic particles produced in the collider’s collisions.
In some cases the quantum Boltzmann machine was indeed better, compared to a classical Boltzmann machine. The team analyzed when and why this happens, building a better understanding of how to apply these new quantum tools in this research setting. The team also studied the effect of encoding the data into a quantum state, noting that it can have a decisive effect on training performance. Especially enticing is that the quantum Boltzmann machine is efficiently trainable, which our scientists showed in a recent paper published in Nature Communications Physics.
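For readers who want a feel for what a Boltzmann machine actually optimizes, the sketch below is a minimal, purely illustrative classical simulation: it builds the Gibbs state of a small parameterized 2-qubit Hamiltonian with non-commuting terms (the “quantum” ingredient) and fits its measurement probabilities to a toy target distribution. The Hamiltonian, target data, and training loop are our own simplified stand-ins, not the model or dataset from the paper.

```python
import numpy as np
from scipy.linalg import expm

# Pauli matrices
I2 = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.diag([1.0, -1.0])

def gibbs_probs(theta):
    """Computational-basis probabilities of the Gibbs state
    rho = exp(-H) / Tr[exp(-H)] for a toy 2-qubit Hamiltonian
    H = theta[0] * ZZ + theta[1] * (XI + IX).
    The transverse X term does not commute with the ZZ term;
    that non-commutativity is what makes the machine 'quantum'."""
    H = theta[0] * np.kron(Z, Z) + theta[1] * (np.kron(X, I2) + np.kron(I2, X))
    rho = expm(-H)
    rho /= np.trace(rho)
    return np.real(np.diag(rho))  # Born-rule probabilities of the 4 bitstrings

def kl(theta, p_data):
    """KL divergence between the data and model distributions."""
    return float(np.sum(p_data * np.log(p_data / gibbs_probs(theta))))

# Toy target over 2-bit strings (a stand-in for real jet-event data)
p_data = np.array([0.4, 0.1, 0.1, 0.4])

# Crude training: finite-difference gradient descent on the KL divergence
theta, lr, eps = np.zeros(2), 0.5, 1e-4
for _ in range(200):
    grad = np.array([(kl(theta + eps * np.eye(2)[k], p_data)
                      - kl(theta - eps * np.eye(2)[k], p_data)) / (2 * eps)
                     for k in range(2)])
    theta -= lr * grad

print("target distribution:", p_data)
print("trained model:      ", np.round(gibbs_probs(theta), 3))
```

On real hardware the Gibbs state would be prepared and sampled on the quantum computer itself rather than computed by matrix exponentiation, which is exactly where a potential quantum advantage enters.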
Quantinuum, the world’s largest integrated quantum company, pioneers powerful quantum computers and advanced software solutions. Quantinuum’s technology drives breakthroughs in materials discovery, cybersecurity, and next-gen quantum AI. With over 500 employees, including 370+ scientists and engineers, Quantinuum leads the quantum computing revolution across continents.
Last year, we joined forces with RIKEN, Japan's largest comprehensive research institution, to install our hardware at RIKEN’s campus in Wako, Saitama. This deployment is part of RIKEN’s project to build a quantum-HPC hybrid platform combining high-performance computing systems, such as the supercomputer Fugaku, with Quantinuum’s quantum computers.
Today, a paper published in Physical Review Research marks the first of many breakthroughs coming from this international supercomputing partnership. The team from RIKEN and Quantinuum joined up with researchers from Keio University to show that quantum information can be delocalized (scrambled) using a quantum circuit modeled after periodically driven systems.
"Scrambling" of quantum information happens in many quantum systems, from those found in complex materials to black holes. Understanding information scrambling will help researchers better understand things like thermalization and chaos, both of which have wide reaching implications.
To visualize scrambling, imagine a set of particles (say bits in a memory), where one particle holds specific information that you want to know. As time marches on, the quantum information will spread out across the other bits, making it harder and harder to recover the original information from local (few-bit) measurements.
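The toy simulation below makes this picture concrete. It is an illustration only: we encode one bit in the first qubit of a small spin chain and evolve it under a generic chaotic ("mixed-field Ising") Hamiltonian of our own choosing, not the periodically driven circuit studied in the paper, then watch the bit become unreadable from local measurements alone.

```python
import numpy as np
from scipy.linalg import expm

n = 6  # a tiny toy chain; the paper's circuits are a different, driven system
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.diag([1.0, -1.0])

def op(single, site):
    """Embed a single-qubit operator at `site` in the n-qubit register."""
    out = np.array([[1.0]])
    for i in range(n):
        out = np.kron(out, single if i == site else np.eye(2))
    return out

# A generic chaotic 'mixed-field Ising' Hamiltonian: a standard toy scrambler
H = sum(op(Z, i) @ op(Z, i + 1) for i in range(n - 1))
H = H + sum(0.9 * op(X, i) + 0.5 * op(Z, i) for i in range(n))

# Encode one bit in qubit 0 (|00...0> vs |10...0>); the rest start in |0>
psi0 = np.zeros(2 ** n); psi0[0] = 1.0
psi1 = np.zeros(2 ** n); psi1[2 ** (n - 1)] = 1.0  # qubit 0 flipped

Z0 = op(Z, 0)
for t in [0.0, 0.5, 1.0, 2.0, 4.0]:
    U = expm(-1j * H * t)
    a, b = U @ psi0, U @ psi1
    # How well a local measurement of qubit 0 still distinguishes the encodings
    contrast = np.real(np.vdot(a, Z0 @ a) - np.vdot(b, Z0 @ b))
    print(f"t = {t:3.1f}   local <Z0> contrast = {contrast:+.3f}")
# The contrast decays toward zero: the bit can no longer be read from qubit 0
# alone, even though unitarity guarantees it was never destroyed globally.
```

The information is never lost, only delocalized into correlations spanning many qubits, which is what makes the verification step described in the paper meaningful.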
While many classical techniques exist for studying complex scrambling dynamics, quantum computing has long been seen as a promising tool for these studies, due to its inherently quantum nature and the ease with which it implements quantum resources like entanglement. The joint team proved that to be true with their latest result, which shows not only that scrambling states can be generated on a quantum computer, but that they behave as expected and are ripe for further study.
Thanks to this new understanding, we now know that the preparation, verification, and application of a scrambling state, a key quantum information state, can be consistently realized using currently available quantum computers. Read the paper here, and read more about our partnership with RIKEN here.
In our increasingly connected, data-driven world, cybersecurity threats are more frequent and sophisticated than ever. To safeguard modern life, government and business leaders are turning to quantum randomness.
The term to know: quantum random number generators (QRNGs).
QRNGs exploit quantum mechanics to generate truly random numbers, providing the highest level of cryptographic security. This underpins, among many other things, the protection of high-value digital assets and critical infrastructure.
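Conceptually, the protocol is as simple as quantum mechanics gets, as the illustrative sketch below shows: prepare a qubit in an equal superposition and measure it, so the Born rule, rather than any algorithm, decides the outcome. Note the honest limitation baked into the code: a classical computer can only simulate the measurement with a pseudorandom generator, so genuine quantum entropy requires genuine quantum hardware.

```python
import numpy as np

# Prepare |+> = (|0> + |1>) / sqrt(2): an equal superposition of 0 and 1
plus = np.array([1.0, 1.0]) / np.sqrt(2)
p = np.abs(plus) ** 2  # Born rule: measuring yields 0 or 1, each with prob 0.5

# On real hardware, each measurement outcome is physically unpredictable.
# A classical machine can only *simulate* this with a pseudorandom generator,
# which demonstrates the protocol but supplies no genuine quantum entropy.
rng = np.random.default_rng()
bits = rng.choice([0, 1], size=32, p=p)
print("".join(map(str, bits)))
```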
Quantum technologies, including QRNGs, could protect up to $1 trillion in digital assets annually, according to a recent report by the World Economic Forum and Accenture.
The World Economic Forum report identifies five industry groups where QRNGs offer high business value and clear commercialization potential within the next few years.
In line with these trends, recent research by The Quantum Insider projects the quantum security market will grow from approximately $0.7 billion today to $10 billion by 2030.
Quantum randomness is already being deployed commercially.
Recognizing the value of QRNGs, the financial services sector is accelerating its path to commercialization.
Building on this commercial momentum, we aim to broaden our cybersecurity portfolio with the addition of a certified randomness product in 2025.
The National Institute of Standards and Technology (NIST) sets the cryptographic standards used in the U.S. and other countries.
This week, we announced Quantum Origin received NIST SP 800-90B Entropy Source validation, marking the first software QRNG approved for use in regulated industries.
This means Quantum Origin is now available for high-security cryptographic systems and integrates seamlessly with NIST-approved solutions without requiring recertification.
The NIST validation, combined with our peer-reviewed papers, further establishes Quantum Origin as the leading QRNG on the market.
--
It is paramount for governments, commercial enterprises, and critical infrastructure to stay ahead of evolving cybersecurity threats to maintain societal and economic security.
Quantinuum delivers the highest quality quantum randomness, enabling our customers to confront the most advanced cybersecurity challenges present today.
The most common question in the public discourse around quantum computers has been, “When will they be useful?” We have an answer.
Recently, we announced in Nature a successful demonstration of a quantum computer generating certifiable randomness, a critical underpinning of our modern digital infrastructure. We explained how we will be taking a product to market this year based on that advance, one that could only be achieved because we have the world’s most powerful quantum computer.
Today, we have made another huge leap in a different domain, providing fresh evidence that our quantum computers are the best in the world. In this case, we have shown that our quantum computers can be a useful tool for advancing scientific discovery.
Our latest paper shows how our quantum computer rivals the best classical approaches in expanding our understanding of magnetism. This provides an entry point that could lead directly to innovations in fields from biochemistry to defense to new materials. These are tangible and meaningful advances that will deliver real-world impact.
To achieve this, we partnered with researchers from Caltech, Fermioniq, EPFL, and the Technical University of Munich. The team used Quantinuum’s System Model H2 to simulate quantum magnetism at a scale and level of accuracy that pushes the boundaries of what we know to be possible.
As the authors of the paper state:
“We believe the quantum data provided by System Model H2 should be regarded as complementary to classical numerical methods, and is arguably the most convincing standard to which they should be compared.”
Our computer simulated the quantum Ising model, a model for quantum magnetism that describes a set of magnets (physicists call them ‘spins’) on a lattice that can point up or down, and prefer to point the same way as their neighbors. The model is inherently “quantum” because the spins can move between up and down configurations by a process known as “quantum tunneling”.
Researchers have struggled to simulate the dynamics of the Ising model at larger scales due to the enormous computational cost of doing so. Nobel laureate physicist Richard Feynman, who is widely considered to be the progenitor of quantum computing, once said, “it is impossible to represent the results of quantum mechanics with a classical universal device.” When attempting to simulate quantum systems at comparable scales on classical computers, the computational demands can quickly become overwhelming. It is the inherent ‘quantumness’ of these problems that makes them so hard classically, and conversely, so well-suited for quantum computing.
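To see both the model and the classical bottleneck in one place, here is a small, purely illustrative exact simulation of the transverse-field Ising chain. Every added spin doubles the length of the state vector, so the brute-force approach below hits a wall around a few dozen spins, precisely the regime where H2’s qubits become interesting. The couplings and quench protocol are our own toy choices, not those used in the paper.

```python
import numpy as np
from scipy.linalg import expm

# Transverse-field Ising chain (toy scale):
#   H = -J * sum Z_i Z_{i+1}  -  h * sum X_i
# The transverse X term is what lets spins tunnel between up and down.
J, h, n = 1.0, 0.5, 8
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.diag([1.0, -1.0])

def op(single, site):
    """Embed a single-qubit operator at `site` in the n-spin register."""
    out = np.array([[1.0]])
    for i in range(n):
        out = np.kron(out, single if i == site else np.eye(2))
    return out

H = -J * sum(op(Z, i) @ op(Z, i + 1) for i in range(n - 1))
H = H - h * sum(op(X, i) for i in range(n))

# The classical bottleneck in one line: the state vector doubles per spin
print(f"{n} spins -> state vector of length {2 ** n}")

# Quench dynamics: start from all spins up and watch the magnetization relax
psi = np.zeros(2 ** n); psi[0] = 1.0  # |00...0> = all spins up
M = sum(op(Z, i) for i in range(n)) / n
for t in [0.0, 1.0, 2.0, 4.0]:
    phi = expm(-1j * H * t) @ psi
    print(f"t = {t:3.1f}   <magnetization> = {np.real(np.vdot(phi, M @ phi)):+.3f}")
```

State-of-the-art classical methods (tensor networks, for example) push well past this brute-force limit, which is why the comparison in the paper is made against those methods rather than against naive simulation.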
These inherently quantum problems also lie at the heart of many complex and useful material properties. The quantum Ising model is an entry point to confront some of the deepest mysteries in the study of interacting quantum magnets. While rooted in fundamental physics, its relevance extends to wide-ranging commercial and defense applications, including medical test equipment, quantum sensors, and the study of exotic states of matter like superconductivity.
Instead of tailored demonstrations that claim ‘quantum advantage’ in contrived scenarios, our breakthroughs announced this week prove that we can tackle complex, meaningful scientific questions that are difficult for classical methods to address. In the work described in this paper, we have shown that quantum computing can serve as the gold standard for materials simulations. These developments are critical steps toward realizing the potential of quantum computers.
With only 56 qubits in our commercially available System Model H2, the most powerful quantum system in the world today, we are already testing the limits of classical methods, and in some cases exceeding them. Later this year, we will introduce our massively more powerful 96-qubit Helios system, pushing beyond what until recently was deemed possible.