By Kevin Jackson for Quantinuum
The world is a lot smaller than it was in the previous century – or even in the previous decade.
Customers are now accustomed to a wide variety of products that can be delivered from distributors all over the globe. While this is a great opportunity for suppliers, it also presents challenges in supply chain management, logistics, routing, and optimization.
How can distribution companies continue to serve the needs of their customers in the most efficient and effective way possible? This may seem like a simple question, but it becomes a complex computational problem when trying to account for all the variables that can occur within a distribution network.
What’s more, classical computers simply cannot perform this optimization adequately in real-world scenarios. With so many variables, the calculations become too slow to be useful.
That said, new work in quantum computing has shown promise in applications within the optimization field. To that end, we interviewed Quantinuum’s Megan Kohagen and Dr. Mattia Fiorentini to better understand how quantum computing could help optimize logistics and supply chains.
Kohagen and Fiorentini are participating in a panel about quantum computing at Manifest: The Future of Logistics conference this week in Las Vegas, Nevada.
When it comes to optimization it is all about maximizing or minimizing an objective. A good example is a company that delivers goods and products but owns a limited number of trucks. To improve efficiency and minimize costs, the company needs to maximize the number of objects its trucks carry and identify the shortest routes between deliveries.
“You have all these constraints, you have your objective, and you’ve got to make decisions,” said Kohagen, an optimization researcher. “The decisions end up being things like how many goods you are going to send between your distribution centers and your stores. Each of these optimization problems, even if you consider them separately, are hard problems. The technical term is that they’re (non-deterministic polynomial)-hard because you’re dealing with discrete things. For example, I can’t send half a T-shirt to my customer. I can only operate with whole integers.”
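To make that concrete, here is a minimal, hypothetical sketch (not Quantinuum’s software) of the kind of whole-number decision Kohagen describes: choosing how many units of each product to load onto a single truck so that value is maximized without exceeding capacity. Real distribution networks involve vastly more variables and constraints, which is exactly what makes them NP-hard.

```python
# Illustrative toy example only: decide how many whole units of each product
# to load onto one truck, maximizing value without exceeding capacity.
from itertools import product

capacity = 10                                        # truck capacity in pallets
weights = {"shirts": 2, "shoes": 3, "jackets": 4}    # pallets per unit shipped
values  = {"shirts": 5, "shoes": 8, "jackets": 11}   # revenue per unit shipped

best_value, best_plan = 0, None
# Brute force over whole-unit decisions; real networks have far too many
# combinations for this, which is why such problems are NP-hard.
ranges = [range(capacity // weights[p] + 1) for p in weights]
for counts in product(*ranges):
    plan = dict(zip(weights, counts))
    weight = sum(weights[p] * n for p, n in plan.items())
    value = sum(values[p] * n for p, n in plan.items())
    if weight <= capacity and value > best_value:
        best_value, best_plan = value, plan

print(best_plan, best_value)
```

Even this toy version has to respect the “whole integers” constraint Kohagen mentions: the truck can carry three jackets, not three and a half.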
Fiorentini expands on this: “In logistics, we cannot leave anyone behind. If we need to deliver medicine, we cannot decide ‘the villages with less than 1,000 people – we don’t supply them. There are too many, and not enough people live there’. That’s not an option in today’s world.”
Today’s computers struggle to solve these NP-hard optimization problems because of the number of ever-changing variables. Consider the much-studied Traveling Salesperson Problem, which is often used to illustrate the complexity of managing logistics, routing, and supply chains.
This is a theoretical problem where a machine is tasked with finding the shortest route between an identified list of cities that a “salesperson” must visit before returning to the point of origin. This problem is simple enough with only a few cities, but it becomes exponentially harder as more locations are added, and other factors such as multiple salespeople, weather conditions, and unforeseen events arise.
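The blow-up is easy to see even in a toy version of the problem. The hypothetical sketch below brute-forces the shortest tour for a handful of cities with made-up coordinates; because the number of possible tours grows factorially with the number of stops, this exhaustive approach collapses long before real-world problem sizes are reached.

```python
# Illustrative toy example only: brute-force Traveling Salesperson on a
# handful of cities. The number of candidate tours grows factorially, which
# is why adding locations (let alone multiple salespeople or changing
# conditions) quickly overwhelms exhaustive search.
from itertools import permutations
from math import dist

cities = {"A": (0, 0), "B": (1, 5), "C": (4, 3), "D": (6, 1), "E": (3, 7)}

def tour_length(order):
    # Close the loop back to the starting city.
    stops = list(order) + [order[0]]
    return sum(dist(cities[a], cities[b]) for a, b in zip(stops, stops[1:]))

rest = [c for c in cities if c != "A"]
best = min((("A",) + p for p in permutations(rest)), key=tour_length)
print(best, round(tour_length(best), 2))
```

Specialized classical solvers do far better than brute force, as noted below, but the underlying combinatorial growth remains.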
Classical computers can solve this theoretical problem for a single salesperson traveling to thousands of cities. But real-world logistics problems are far messier than that, and this is where classical computers begin to struggle.
“The Traveling Salesperson Problem is not very representative of what happens in the real world,” Kohagen said. “For example, with online ordering so prevalent, a retailer has orders coming in constantly. They must determine how to efficiently retrieve those items from the warehouse, pack them into the trucks, and then transport them to the customers.”
Today, the reality of an extended supply chain or distribution network is beyond what the best classical computers can handle. Quantum computers harness unique properties of quantum physics to explore vast numbers of candidate solutions at once and concentrate the output of the computation on the best options.
“Classical is a great technology, but it doesn’t cut it here,” said Fiorentini, who develops and tests quantum algorithms for optimization. “Quantum is the best alternative to classical computing that we have.”
Optimization problems have long been viewed as “killer applications” for quantum computing, and research conducted by Fiorentini, Kohagen, and others has begun to bear that out.
Fiorentini believes it is time for decision makers to explore and invest in quantum-enabled solutions for optimization problems. “There are two decisions here for decision makers,” he said. “We either give up on the problem and say, ‘we’ll just do the best we can with a classical solution,’ or we start allocating a budget for really developing quantum technology.”
Quantum computing is expanding rapidly and is poised to disrupt markets such as optimization. A similar situation is the power sector, which is experiencing major disruptions due to innovations in renewable energy resources, energy storage, and regulatory reform.
Every technology has a tipping point, and all signs point to quantum computing moving rapidly toward real-world applications in optimization.
“There are a lot of algorithms being developed for optimization right now,” said Kohagen. “If you really want to advance your business with quantum methods for logistics or supply chain, this is the moment to start. Decision makers must act quickly. Those that seize the opportunity before others will have a major advantage over those who lag.”
“As quantum computers continue to scale in computational power, they’ll be able to handle increasingly complex calculations to deliver more robust and optimized supply chain solutions,” said Tony Uttley, President and COO of Quantinuum.
“We’re excited by the acceleration of our System Model H1 technologies, Powered by Honeywell. Measured in terms of qubit number as well as quantum volume, we’re meeting our commitment to increase performance by 10X each year,” he said. “Alongside other revolutionary advances such as real-time error correction, we look forward to supporting the commercialization of quantum applications that will change the way logistical challenges are met. In fact, within the coming few months we’ll be sharing more exciting news regarding our latest technological achievements.”
Want to learn about our work to develop quantum-enabled optimization solutions for companies? Contact our experts.
Quantinuum, the world’s largest integrated quantum company, pioneers powerful quantum computers and advanced software solutions. Quantinuum’s technology drives breakthroughs in materials discovery, cybersecurity, and next-gen quantum AI. With over 500 employees, including 370+ scientists and engineers, Quantinuum leads the quantum computing revolution across continents.
For a novel technology to be successful, it must prove both that it is useful and that it works as described.
Checking that our computers “work as described” is called benchmarking and verification by the experts. We are proud to be leaders in this field, with the most benchmarked quantum processors in the world and our own team of experts advancing the state of the art. We also work with National Laboratories in various countries to develop new benchmarking techniques and standards.
Currently, a lot of verification (i.e. checking that you got the right answer) is done by classical computers – most quantum processors can still be simulated by a classical computer. As we move towards quantum processors that are hard (or impossible) to simulate, this introduces a problem: how can we keep checking that our technology is working correctly without simulating it?
We recently partnered with the UK’s Quantum Software Lab to develop a novel and scalable verification and benchmarking protocol that will help us as we make the transition to quantum processors that cannot be simulated.
This new protocol does not require classical simulation or the transfer of a qubit between two parties. The team’s “on-chip” verification protocol eliminates the need for a physically separated verifier and makes no assumptions about the processor’s noise. To top it all off, this new protocol is qubit-efficient.
The team’s protocol is application-agnostic, benefiting all users. Further, the protocol is optimized for our QCCD hardware, meaning that we have a path towards verified quantum advantage – as we compute more things that cannot be classically simulated, we will be able to check that what we are doing is right.
Running the protocol on the Quantinuum System Model H1, the team performed the largest verified Measurement Based Quantum Computing (MBQC) circuit to date. This was enabled by the System Model H1’s low cross-talk gate zones, mid-circuit measurement and reset, and long coherence times. By verifying computations significantly larger than any verified before, the team reaffirms Quantinuum’s systems as best-in-class.
Particle accelerators like the LHC take serious computing power. Often on the bleeding-edge of computing technology, accelerator projects sometimes even drive innovations in computing. In fact, while there is some controversy over exactly where the world wide web was created, it is often attributed to Tim Berners-Lee at CERN, who developed it to meet the demand for automated information-sharing between scientists in universities and institutes around the world.
With annual data generated by accelerators in excess of exabytes (an exabyte is a billion gigabytes), tens of millions of lines of code written to support the experiments, and incredibly demanding hardware requirements, it’s no surprise that the High Energy Physics community is interested in quantum computing, which offers real solutions to some of their hardest problems. Furthermore, the HEP community is well-positioned to support the early stages of technological development: with budgets in the tens of billions of dollars per year and tens of thousands of scientists and engineers working on accelerator and computational physics, this is a ripe industry for quantum computing to tap.
As the authors of this paper stated: “[Quantum Computing] encompasses several defining characteristics that are of particular interest to experimental HEP: the potential for quantum speed-up in processing time, sensitivity to sources of correlations in data, and increased expressivity of quantum systems... Experiments running on high-luminosity accelerators need faster algorithms; identification and reconstruction algorithms need to capture correlations in signals; simulation and inference tools need to express and calculate functions that are classically intractable.”
The authors go on to state: “Within the existing data reconstruction and analysis paradigm, access to algorithms that exhibit quantum speed-ups would revolutionize the simulation of large-scale quantum systems and the processing of data from complex experimental set-ups. This would enable a new generation of precision measurements to probe deeper into the nature of the universe. Existing measurements may contain the signatures of underlying quantum correlations or other sources of new physics that are inaccessible to classical analysis techniques. Quantum algorithms that leverage these properties could potentially extract more information from a given dataset than classical algorithms.”
Our scientists have been working with a team at DESY, one of the world’s leading accelerator centers, to bring the power of quantum computing to particle physics. DESY, short for Deutsches Elektronen-Synchrotron, is a national research center for fundamental science located in Hamburg and Zeuthen, where the Center for Quantum Technologies and Applications (CQTA) is based. DESY operates, develops, and constructs particle accelerators used to investigate the structure, dynamics and function of matter, and conducts a broad spectrum of interdisciplinary scientific research. DESY employs about 3,000 staff members from more than 60 nations, and is part of the worldwide computer network to store and analyze the enormous flood of data that is produced by the LHC in Geneva.
In a recent paper, our scientists collaborated with scientists from DESY, the Leiden Institute of Advanced Computer Science (LIACS), and Northeastern University to explore using a generative quantum machine learning model, called a “quantum Boltzmann machine”, to untangle data from CERN’s LHC.
The goal was to learn probability distributions relevant to high energy physics better than the corresponding classical models can. The data specifically contains “particle jet events”, which capture data about the sprays of subatomic particles generated during collider experiments.
In some cases the quantum Boltzmann machine was indeed better than a classical Boltzmann machine. The team analyzed when and why this happens, gaining a better understanding of how to apply these new quantum tools in this research setting. The team also studied the effect of encoding the data into a quantum state, noting that it can have a decisive effect on training performance. Especially enticing is that the quantum Boltzmann machine is efficiently trainable, which our scientists showed in a recent paper published in Nature Communications Physics.
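For readers unfamiliar with this model family, the sketch below illustrates the classical energy-based idea a Boltzmann machine is built on: each state is assigned a probability proportional to exp(-E(x)), and training reshapes the energy function so the model distribution matches the data. This toy example is purely illustrative, with made-up parameters; it is not the quantum Boltzmann machine or the jet dataset used in the paper.

```python
# Illustrative only: evaluate the Boltzmann distribution p(x) ~ exp(-E(x))
# for a tiny spin system with random couplings and biases.
import itertools
import numpy as np

rng = np.random.default_rng(0)
n = 4                                    # number of binary units (spins)
W = rng.normal(scale=0.5, size=(n, n))   # couplings between units
W = (W + W.T) / 2                        # make symmetric
np.fill_diagonal(W, 0.0)
b = rng.normal(scale=0.5, size=n)        # per-unit biases

def energy(x):
    # Ising-style energy function of a spin configuration x in {-1, +1}^n.
    return -0.5 * x @ W @ x - b @ x

# Enumerate all 2^n configurations and normalize exp(-E) into probabilities.
states = np.array(list(itertools.product([-1, 1], repeat=n)))
logits = np.array([-energy(x) for x in states])
probs = np.exp(logits - logits.max())
probs /= probs.sum()

# Print the three most probable configurations under this model.
for x, p in sorted(zip(states.tolist(), probs), key=lambda t: -t[1])[:3]:
    print(x, round(float(p), 3))
```

Training such a model adjusts W and b so that high-probability configurations match the observed data; the quantum version studied in the paper replaces this classical energy function with a quantum Hamiltonian.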
Find the Quantinuum team at this year’s SC24 conference from November 17th – 22nd in Atlanta, Georgia. Meet our team at Booth #4351 to discover how Quantinuum is bridging the gap between quantum computing and high-performance compute with leading industry partners.
The Quantinuum team will be participating in various events, panels, and poster sessions to showcase our quantum computing technologies. Join us at the below sessions:
Panel: KAUST booth 1031
Nash Palaniswamy, Quantinuum’s CCO, will join a panel discussion with quantum vendors and KAUST partners to discuss advancements in quantum technology.
Beowulf Bash: World of Coca-Cola
This year, we are proudly sponsoring the Beowulf Bash, an event organized to bring the HPC community together for a night of unique entertainment! Join us at the event on Monday, November 18th, 9:00pm at the World of Coca-Cola.
Panel: Educating for a Hybrid Future: Bridging the Gap between High-Performance and Quantum Computing
Vincent Anandraj, Quantinuum’s Director of Global Ecosystem and Strategic Alliances, will moderate this panel which brings together experts from leading supercomputing centers and the quantum computing industry, including PSC, Leibniz Supercomputing Centre, IQM Quantum Computers, NVIDIA, and National Research Foundation.
Presentation: Realizing Quantum Kernel Models at Scale with Matrix Product State Simulation
Pablo Andres-Martinez, Research Scientist at Quantinuum, will present research done in collaboration with HSBC, where the team applied quantum methods to fraud detection.