One of the greatest privileges of working directly with the world’s most powerful quantum computer at Quantinuum is building meaningful experiments that turn theory into practice. That privilege becomes even more compelling given that our current quantum processor, the H2 system, will soon be joined by Helios, a quantum computer a stunning trillion times more powerful, due to launch in just a few months. The moment has arrived when we can lay out an experimentally supported timeline for applications that quantum computing professionals have anticipated for decades.
Background
In the 1980s, around the time Richard Feynman and David Deutsch were developing their initial ideas about quantum computing, Nicholas Cozzarelli at the University of California, Berkeley, was grappling with a biochemical riddle: how do enzymes called topoisomerases and recombinases untangle the DNA strands that knot themselves inside cells?
Cozzarelli teamed up with mathematicians including De Witt Sumners, who recognized that these twisted strands could be modeled using the language of knots.
Knot theory’s equations let them deduce how enzymes snipped, flipped, and reattached DNA, demystifying processes essential to life. Decoding the knots in DNA proved crucial to designing better antibiotics and to advancing genetic engineering.
Cozzarelli’s team took advantage of the power of knot invariants—polynomial expressions that remain consistent markers of a knot’s identity, no matter how tangled the loops become. This is just one example of how knot theory has been used to solve real-world problems of practical value.
Today, knot theory finds practical uses in fields as diverse as chemistry, robotics, fluid dynamics, and drug design. Yet computing the invariants that characterize each knot is a challenge that scales exponentially with the knot’s complexity.
This work shows how a quantum computer can cut through this exponential explosion, indicating that Quantinuum's next-generation systems will offer practical quantum advantage in solving knot theory problems.
In this article, Konstantinos Meichanetzidis, a team leader in Quantinuum’s AI group, explains intriguing and valuable new research on applying quantum computers to problems in knot theory.
Quantifying quantum advantage for knot theory
Quantinuum’s applied quantum algorithms team has published, in a preprint on the arXiv, an end-to-end algorithm for solving a famous problem in knot theory. The research team, led by Konstantinos Meichanetzidis, also included Quantinuum researchers Enrico Rinaldi, Chris Self, Eli Chertkov, Matthew DeCross, David Hayes, Brian Neyenhuis, and Marcello Benedetti, as well as Tuomas Laakkonen of the Massachusetts Institute of Technology.
The project was motivated by the goal of building configurable, comprehensive algorithmic tools for pinpointing quantum advantage in practice. This was done by rigorously defining time and error budgets and quantifying the classical and quantum resources required to meet them. Modeling realistic quantum and classical processors, the team predicts that Quantinuum’s forthcoming quantum computers will meet those requirements.
Knot theory is a branch of the field of mathematics called ‘low-dimensional topology’, with a rich history stemming from a wild idea proposed by Lord Kelvin, who conjectured that the chemical elements are different knots formed by vortices in the ether. Of course, the ether theory did not hold up under experimental scrutiny, but mathematicians have been classifying and studying knots ever since. Knot theory is intrinsically linked with many aspects of physics. For example, it shows up naturally in certain spin models in statistical mechanics. Today, the physical properties of knots are important for understanding the stability of macromolecular structures, from DNA and proteins to the polymers relevant to materials design. Knots find their way into cryptography. Even the magnetohydrodynamic properties of knotted magnetic fields on the surface of the Sun are an important indicator of solar activity.
Most importantly for our context, knot theory has fundamental connections to quantum computation, originally outlined by Witten’s work in topological quantum field theory, concerning spacetimes without any notion of distance but only shape. In fact, this connection formed the very motivation for attempting to build topological quantum computers, where anyons – exotic quasiparticles that live in two-dimensional materials – are braided to perform quantum gates.
Konstantinos Meichanetzidis, who led the project, said: “The relation between knot theory and quantum physics is the most beautiful and bizarre fact you have never heard of.”


The fundamental problem in knot theory is distinguishing knots or, more generally, links. To this end, mathematicians have defined link invariants, which serve as ‘fingerprints’ of a link. A given link has many equivalent representations, and an invariant, by definition, takes the same value on all of them. If an invariant differs for two links, then the links are not equivalent (the converse need not hold: distinct links can share an invariant). The specific invariant our team focused on is the Jones polynomial.
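To make this concrete, here is the textbook skein-relation characterization of the Jones polynomial (standard material, not specific to the new paper): the polynomial is pinned down by its value on the unknot together with a rule relating three links that are identical except at a single crossing.

```latex
% Skein relation defining the Jones polynomial V_L(t).
% L_+, L_-, L_0 agree outside a small disc, inside which they have a
% positive crossing, a negative crossing, and no crossing, respectively.
V_{\text{unknot}}(t) = 1, \qquad
t^{-1}\, V_{L_+}(t) - t\, V_{L_-}(t) = \bigl(t^{1/2} - t^{-1/2}\bigr)\, V_{L_0}(t)
```

Applying the relation recursively to the trefoil, for instance, gives V(t) = -t^{-4} + t^{-3} + t^{-1} (its mirror image swaps t and t^{-1}); since this differs from the unknot’s value of 1, no amount of wiggling can untie a trefoil.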
The mind-blowing fact here is that any quantum computation corresponds to evaluating the Jones polynomial of some link at a root of unity, as shown in works by Freedman, Larsen, Kitaev, Wang, Shor, Arad, and Aharonov; in complexity-theoretic terms, the approximation problem is BQP-complete. This reveals that the abstract mathematical problem is truly quantum native. In particular, our team tackled the problem of estimating the Jones polynomial at the 5th root of unity. This case is well studied due to its relation to the famed Fibonacci anyons, whose braiding is capable of universal quantum computation.
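The Fibonacci connection can be seen at a glance (these are standard anyon-model facts, not results of the paper): in the Aharonov–Jones–Landau setting, evaluating at the k-th root of unity comes with a loop weight d = 2cos(π/k), and at k = 5 this weight is exactly the golden ratio, which is the quantum dimension of the Fibonacci anyon.

```latex
% Evaluation point and loop weight at k = 5: the golden ratio appears.
t = e^{2\pi i/5}, \qquad
d = 2\cos\!\left(\frac{\pi}{5}\right) = \frac{1+\sqrt{5}}{2} = \varphi \approx 1.618
```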

Building on and improving the work of Shor, Aharonov, Landau, Jones, and Kauffman, our team developed an efficient quantum algorithm that works end-to-end: given a link, it outputs a highly optimized quantum circuit, readily executable on our processors, that estimates the desired quantity. Furthermore, the team designed problem-tailored error-detection and error-mitigation strategies to achieve higher accuracy.
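For intuition about the primitive underlying such estimators, here is a minimal NumPy sketch, emphatically not Quantinuum’s optimized pipeline: it maps a braid word to a unitary in the standard two-dimensional Fibonacci representation of the three-strand braid group, then estimates Re⟨ψ|U|ψ⟩ with a simulated Hadamard test. All function names and conventions below are our own illustrative choices.

```python
# Illustrative sketch only: braid word -> unitary -> simulated Hadamard test.
import numpy as np

PHI = (1 + np.sqrt(5)) / 2                       # golden ratio = 2*cos(pi/5)
R = np.diag([np.exp(-4j * np.pi / 5),            # braiding phase, vacuum channel
             np.exp(3j * np.pi / 5)])            # braiding phase, tau channel
F = np.array([[1 / PHI,          1 / np.sqrt(PHI)],
              [1 / np.sqrt(PHI), -1 / PHI]])     # F-matrix (it is self-inverse)
GEN = {1: R, 2: F @ R @ F}                       # sigma_1 and sigma_2

def braid_unitary(word):
    """Multiply generator matrices for a braid word such as [1, -2, 1];
    the sign selects the generator or its inverse."""
    U = np.eye(2, dtype=complex)
    for g in word:
        M = GEN[abs(g)]
        U = (M if g > 0 else M.conj().T) @ U
    return U

def hadamard_test(U, psi, shots=100_000, seed=7):
    """Simulate the Hadamard test: ancilla in |+>, controlled-U, X-basis
    readout; the ancilla's expectation value equals Re<psi|U|psi>."""
    p_plus = (1 + (psi.conj() @ U @ psi).real) / 2   # P(ancilla -> |+>)
    rng = np.random.default_rng(seed)
    return 2 * (rng.random(shots) < p_plus).mean() - 1

psi = np.array([1.0, 0.0], dtype=complex)        # reference fusion-tree state
U = braid_unitary([1, -2, 1, -2])                # an example braid word
print(hadamard_test(U, psi))                     # ~= Re<psi|U|psi>
```

On real hardware the amplitude is of course not available to peek at; the controlled-braiding circuit itself is executed and the ancilla statistics are collected, which is exactly why the circuit optimization and error-handling strategies described above matter so much.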
In addition to providing a full pipeline for solving this problem, a major aspect of this work was to use the invariance of the Jones polynomial to introduce a benchmark for noisy quantum computers. Most importantly, this benchmark is efficiently verifiable, a rare property, since most applications require exponentially costly classical computations for verification. Given a link whose Jones polynomial is known, the benchmark constructs a large set of topologically equivalent links of varying sizes. These in turn yield a set of circuits of varying numbers of qubits and gates, all of which should return the same answer. One can thus characterize the effect of noise in a given quantum computer by quantifying the deviation of its output from the known result.
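As a toy version of that idea, reusing braid_unitary, hadamard_test, and psi from the sketch above and again purely illustrative: inserting a generator next to its inverse leaves the braid, and hence its closure, unchanged, so ever-larger circuits must all return the same estimate, and on a noisy machine the drift from the known value quantifies the noise.

```python
# Toy invariance benchmark, reusing the helpers from the sketch above.
# Padding with sigma_1 * sigma_1^{-1} pairs preserves the link, so every
# padded instance should agree with the reference up to sampling noise.
base = [1, -2, 1, -2]
reference = hadamard_test(braid_unitary(base), psi)
word = list(base)
for _ in range(4):
    word += [1, -1]                              # inert pair: same link, bigger circuit
    estimate = hadamard_test(braid_unitary(word), psi, seed=len(word))
    print(len(word), abs(estimate - reference))  # deviation from known value
```

In this noiseless simulation only shot noise remains; on hardware, the padded circuits’ deeper gate sequences expose the device’s actual error rates.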

The benchmark introduced in this work allows one to identify the link sizes at which there is an exponential quantum advantage, in time to solution, over state-of-the-art classical methods. The resource estimates indicate that our next processor, Helios, with 96 qubits and at least 99.95% two-qubit gate fidelity, is extremely close to meeting these requirements. Furthermore, Quantinuum’s hardware roadmap includes even more powerful machines that will come online by the end of the decade. Notably, an advantage in energy consumption emerges at even smaller link sizes. Meanwhile, our teams aim to keep reducing errors through improvements in both hardware and software, moving ever deeper into quantum-advantage territory.
The importance of this work, and indeed its uniqueness in the quantum computing sector, lies in its practical, end-to-end approach. The advantage-hunting strategies it introduces are transferable to other “quantum-easy, classically-hard” problems.
Our team’s efforts motivate shifting the focus toward specific problem instances rather than broad problem classes, promoting an engineering-oriented approach to identifying quantum advantage. This means carefully considering how quantum advantage should be defined and quantified, setting a high standard for claims in scientific and mathematical domains, and instilling confidence in our customers and partners.