Study Seeks to Unite High-Performance Computing, Quantum Computing for Science
August 28, 2024 -- A study by more than a dozen scientists at the Department of Energy’s Oak Ridge National Laboratory examines potential strategies to integrate quantum computing with the world’s most powerful supercomputing systems in the pursuit of science.
The study, published in Future Generation Computer Systems, takes a big-picture look at the current state of quantum computing and of classical high-performance computing, or HPC, and describes a potential framework for boosting traditional scientific HPC by leveraging the quantum approach.
“It’s kind of a manifesto for how we propose to dive as a laboratory into this new era of computing,” said co-author Rafael Ferreira da Silva, a senior research scientist for ORNL’s National Center for Computational Sciences, or NCCS. “Our approach won’t be the only right way, but we think it will be a useful one that builds on ORNL’s legacy as a leader in supercomputing and that we can adapt as technology evolves and the next generation of computing takes shape.”
ORNL serves as home to the Oak Ridge Leadership Computing Facility, or OLCF, which houses Frontier, the world’s fastest supercomputer, and to the OLCF Quantum Computing User Program, which awards time on privately owned quantum processors around the country to support independent quantum study. The laboratory also leads the DOE’s Quantum Science Center, a national Quantum Information Science Research Center, which combines resources and expertise from national laboratories, universities and industry partners to investigate quantum computing, quantum sensing and quantum materials.
“We have a vast amount of experience here at ORNL in standing up classical supercomputers, dating back more than 20 years,” said Tom Beck, the study’s lead author, who oversees the NCCS Science Engagement Section. “How can we apply that experience and maintain that momentum as we explore this new quantum domain?”
Classical computers store information in bits, each equal to either 0 or 1. In other words, a classical bit, like a light switch, exists in one of two states: on or off. That binary representation doesn't map naturally onto some complex scientific problems.
“We encounter certain problems in science in which electrons, for example, are coupled between atoms in ways that grow exponentially when we try to model them on a classical computer,” Beck said. “We can adjust formulas and try to tackle those problems in an abbreviated fashion, but we can’t even begin to hope to solve them on a classical computer. The necessary equations and computations are just too complex.”
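To make that scaling concrete, the short sketch below, an illustration added here rather than taken from the study, estimates how much memory a classical computer would need just to store the full quantum state of a collection of coupled two-level particles such as electron spins. The number of amplitudes doubles with every particle added.

```python
# Illustrative arithmetic: a full classical description of n coupled
# two-level systems needs 2**n complex amplitudes, so the memory required
# doubles with every particle added.

def statevector_memory_bytes(n: int, bytes_per_amplitude: int = 16) -> int:
    """Memory to store the full state vector of n two-level systems,
    assuming one complex double (16 bytes) per amplitude."""
    return (2 ** n) * bytes_per_amplitude

for n in (10, 30, 50):
    b = statevector_memory_bytes(n)
    print(f"{n:2d} particles -> {b:,} bytes ({b / 2**30:,.1f} GiB)")
# 10 particles fit in kilobytes, 30 already need 16 GiB, and 50 need
# roughly 16.8 million GiB (about 16 pebibytes) -- far beyond any single machine.
```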
Quantum computing uses the laws of quantum mechanics to store information in qubits, the quantum equivalent of bits. Qubits can exist in more than one state simultaneously through quantum superposition, which allows them to carry more information than classical bits.
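As a rough illustration of superposition, the following sketch, which is illustrative only and uses variable names chosen for this article, represents a single qubit as a two-component complex vector. An equal superposition gives 50/50 measurement odds, and every added qubit multiplies the number of amplitudes needed to describe the system.

```python
import numpy as np

# A classical bit is either 0 or 1. A qubit's state is a normalized complex
# vector over the basis states |0> and |1>; in a superposition, both
# amplitudes can be nonzero at the same time.
ket0 = np.array([1, 0], dtype=complex)   # |0>
ket1 = np.array([0, 1], dtype=complex)   # |1>

# Equal superposition (the state a Hadamard gate produces from |0>).
psi = (ket0 + ket1) / np.sqrt(2)

# Measurement probabilities are the squared magnitudes of the amplitudes.
print(np.abs(psi) ** 2)        # [0.5 0.5] -- a 50/50 chance of reading 0 or 1

# Two qubits are described by a 4-amplitude vector (the tensor product),
# which is why the classical description grows as 2**n for n qubits.
two_qubits = np.kron(psi, psi)
print(two_qubits.shape)        # (4,)
```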
“The quantum aspect allows us to represent the problem in a more efficient way and potentially opens up a new way to solve problems that we couldn’t before,” Beck said.
Scientists haven’t yet settled on the most effective technology for encoding qubits, and high error rates remain an obstacle to harnessing quantum computing’s potential. The study proposes developing quantum test beds to explore the various technologies and coupling those test beds with classical machines.
“We don’t want to tie ourselves to any single technology yet because we don’t know what approach will emerge as the best,” Beck said. “But while we’re in this early stage, we need to begin incorporating quantum elements into our computing infrastructure with an eye toward potential breakthroughs. Ultimately, we want to connect these two vastly different types of computers in a seamless way to run the machines together — similar to the hybrid architecture of graphics processing units, or GPUs, and central processing units, or CPUs, that accelerates current leadership-class supercomputers.”
That hybrid architecture, used by supercomputers such as Frontier, integrates the two kinds of processors on each node for the fastest possible computing — GPUs for the repetitive calculations that make up the backbone of most simulations and CPUs for higher-level tasks such as retrieving information and executing other instructions. The technology needed for classical and quantum processors to share space on a node doesn’t yet exist.
The study recommends a high-speed network as the best way to connect classical HPC resources with quantum computers for now.
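The sketch below is a purely illustrative take on that loosely coupled model: a classical optimization loop, as might run on an HPC system, repeatedly hands a parameterized circuit to a networked quantum resource and uses the returned measurement estimates to update its parameters. The QuantumBackendClient class and its run_circuit method are hypothetical placeholders, not an ORNL or vendor API, and the "quantum" side is simulated classically here so the example is self-contained.

```python
import numpy as np

class QuantumBackendClient:
    """Stand-in for a networked quantum service; a real deployment would
    send the circuit over the network and return measured counts."""

    def run_circuit(self, theta: float, shots: int = 1000) -> float:
        # Simulate measuring <Z> for the one-qubit state
        # cos(theta/2)|0> + sin(theta/2)|1>, including shot noise.
        p0 = np.cos(theta / 2) ** 2
        counts0 = np.random.binomial(shots, p0)
        return (2 * counts0 - shots) / shots

backend = QuantumBackendClient()
theta = 0.3  # initial circuit parameter

for step in range(25):
    # Each gradient estimate costs two round trips to the quantum resource;
    # the classical side does the bookkeeping and the parameter update.
    eps = 0.1
    grad = (backend.run_circuit(theta + eps)
            - backend.run_circuit(theta - eps)) / (2 * eps)
    theta -= 0.5 * grad

print(f"optimized theta ~ {theta:.2f}, energy ~ {backend.run_circuit(theta):.2f}")
# Expect theta near pi and energy near -1, the minimum of <Z> = cos(theta).
```

Every round trip of that kind adds network latency, which is one reason tighter degrees of integration remain a longer-term goal.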
“There are degrees of integration, and we won’t achieve the ideal right away,” said ORNL’s Sarp Oral, who oversees the NCCS Advanced Technologies Section. “To achieve that ideal, we need to identify which algorithms and applications can take advantage of quantum computing. Our job is to provide better ways to conduct science, and quantum computing can be a tool that serves that purpose.”