Lawrence Livermore National Laboratory is advancing quantum science research to solve increasingly complex national security challenges.
In recent years, Lawrence Livermore has increased efforts in quantum information science, a multidisciplinary field that applies the principles of quantum mechanics to problems of extraordinary computational magnitude and to precision sensing. Experts are quick to qualify hype surrounding all things “quantum”—a term referring to the nonintuitive nature of matter at its smallest, indivisible scale. Still, researchers have ample reason to be excited about the advances in theory and experimentation that the Laboratory, its research partners, and private industry have made to access and harness quantum information.
“Similar to lasers in the 1960s, quantum technologies such as quantum computing are still in search of their ‘killer app.’ What we do know is that the applications to basic and applied sciences will be broad,” says physicist Kristi Beck, director of the Livermore Center for Quantum Science (LCQS). To continue expanding Livermore’s quantum science applications, researchers are exploring how quantum computing (QC) can solve some of the research problems that classical high-performance computing (HPC) cannot. This effort requires profound understanding of how the materials used in quantum devices impact their performance. It also requires expertise in mathematical modeling to develop new quantum-based algorithms and calculate optimal pulses for improved performance in realistic noise environments. With the launch of LCQS, the Laboratory aims to foster new connections with quantum researchers outside Lawrence Livermore.
Processing the Heart of Matter
Lawrence Livermore’s world-class HPC systems provide a key capability for scientific research. However, these systems are not designed to handle the massive computational demand required to solve complex quantum mechanics problems quickly, and that demand continues to grow. Within the quantum science field, QC is most often considered the best candidate to augment HPC systems because it applies quantum mechanics to accomplish in relatively few operations what would require exponentially many steps of classical 0-1 computing.
Quantum computers are not a panacea for all computational limitations; rather, they are ideal for wrangling certain complex, resource-intensive calculations through new algorithms that offer exponential gains in speed, along with corresponding reductions in cost and energy use, over current computing platforms. Such computational challenges are commonplace at Livermore, where scientists model complex interactions between molecules, atoms, and subatomic particles with high precision. Studying physics at this scale is vital to pursuing basic science questions inherent in the mission spaces of the Department of Energy and National Nuclear Security Administration. Livermore’s interest in the evolution of nuclear systems originates from its annual charge to manage and certify the nation’s nuclear deterrent through high-accuracy modeling and simulations, ranging from fundamental processes such as nuclear fission, fusion, radiation, and decay to large-scale components and systems.
“With each particle we add to a simulation, the calculations involved explode in dimension and are often a nightmare to solve,” says Livermore researcher Sofia Quaglioni, who leads the Nuclear Data and Theory group in Livermore’s Physical & Life Sciences Principal Directorate. Quaglioni and her collaborators seek to enhance data used to construct nuclear models and reaction simulations. The energy and scale regimes represented by these processes naturally extend to basic and applied research areas such as element formation and control over the fission and fusion reactions inherent in stockpile science and nuclear energy–generating technologies.
By analyzing the static and dynamic properties of atomic nuclei in terms of their constituent protons and neutrons, scientists can better calculate reaction probabilities (cross-sections) in different scenarios. Accurately simulating interactions involves complex forces beyond electromagnetism, chiefly the strong interaction ultimately responsible for binding nuclei and, at an even more fundamental level, for holding quarks together to form neutrons, protons, and other particles. “Nucleons experience a strong mutual attraction when they are a few quadrillionths of a meter apart, but once any closer, they fiercely repel each other. They interact in pairs as well as in triplets and do so on extremely short timescales. We’re still trying to figure out what these interactions look like,” says Quaglioni.
Today, a sophisticated first-principles simulation of thermonuclear fusion could take millions of hours to compute, leaving uncertain much of the needed data for many nuclei and their reactions. QC could help fill in this missing cross-section data for reactions involving nuclei that are too complex to be adequately probed with HPC or too short-lived to study in collider experiments. “The structure of the underlying states we are trying to compute is too complicated to be captured classically. Using QC, we obtain a more natural representation of the quantum mechanics of atomic nuclei,” says physicist Kyle Wendt. By “states,” scientists refer to the full information necessary to describe a quantum system at a given time: the particles’ relative locations, velocities, spins, and other properties described by the system’s wavefunction. The mathematical methods for calculating probabilistic interactions between particles cause computational complexity to grow exponentially. Wendt explains, “Suppose we used a 10-row vector to describe a particle’s state. Describing two particles requires 100 rows; three particles, 1,000 rows. Adding a single qubit to a quantum computer doubles the representable information space, matching the exponential growth of the mechanics we want to describe.”
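To make the arithmetic concrete, the following minimal numpy sketch (illustrative only, not Laboratory code) composes quantum states with the tensor product, printing how quickly classical storage grows with particle count and how each added qubit doubles the representable state space:

```python
# Composing quantum states with the Kronecker (tensor) product multiplies
# vector sizes, so classical memory grows exponentially with particle count.
import numpy as np

d = 10                                    # 10-row vector for one particle
one_particle = np.random.rand(d)
one_particle /= np.linalg.norm(one_particle)

state = one_particle
for n in range(1, 4):
    print(f"{n} particle(s): state vector has {state.size} rows")
    state = np.kron(state, one_particle)  # add one more particle: 10, 100, 1000

# Each added qubit (d = 2) likewise doubles the state space:
qubit = np.array([1.0, 0.0])
register = qubit
for n in range(1, 5):
    print(f"{n} qubit(s): {register.size}-dimensional state space")
    register = np.kron(register, qubit)   # 2, 4, 8, 16, ...
```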
The Right Stuff
The practical realization of the advantages of QC is another challenge altogether given these systems’ susceptibility to noise sources and the extraordinary precision required when fabricating QC components. “Scientists have reached the point where we can routinely perform quantum experiments that were once revolutionary, but research and industry have yet to reach the point of wide commercial availability of the requisite hardware,” says physicist Yaniv Rosen. “At present, quantum computers are not computers as we normally picture them. They’re science experiments.” Rosen leads the Quantum Coherent Device Physics (QCDP) group, which works to translate quantum theory into full-stack hardware solutions from nanoscopic to macroscopic levels.
Quantum bits (qubits) are at the heart of quantum computers. By leveraging quantum superposition to encode many logic states at once, a register of qubits occupies an exponentially larger state space than the same number of classical bits, which can accelerate specific calculations. Such efficiency comes in part from the ability of qubit states to be tied together through quantum entanglement, where perturbation of one qubit instantaneously influences the state of another. To maintain the sustained coherence (the ability of a quantum system to maintain a well-defined phase relationship between different states in a superposition) necessary to entangle qubits and execute operations in quantum algorithms, qubits must remain in stable, insulated conditions. “Currently, the longest anyone can keep superconducting qubits fully entangled with each other is a matter of microseconds or even milliseconds, which is often insufficient for carrying out quantum algorithms,” says Loren Alegria, a physicist in Rosen’s group. “As we entangle more qubits, the differences in their performance and coherence times compound, and entanglement persists only as long as the weakest link allows.” The results of a computation are finally revealed by measuring one or more qubits, causing their superposition of states to collapse into a single state that maps probabilistically onto familiar discrete binary terms (or ternary or higher, if using qudits, quantum units that encode more than two levels). (See S&TR, December 2018, Livermore Leaps into Quantum Computing.)
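A minimal sketch of these principles, written here in plain numpy rather than any Livermore software, prepares a superposition with a Hadamard gate, entangles two qubits into a Bell state with a CNOT gate, and samples measurements that collapse the pair into correlated classical outcomes:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard: creates superposition
CNOT = np.array([[1, 0, 0, 0],                 # flips second qubit when the
                 [0, 1, 0, 0],                 # first qubit is 1, entangling them
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])
I = np.eye(2)

psi = np.array([1, 0, 0, 0], dtype=float)      # start in |00>
psi = CNOT @ np.kron(H, I) @ psi               # Bell state (|00> + |11>)/sqrt(2)

probs = np.abs(psi) ** 2                       # Born rule: outcome probabilities
rng = np.random.default_rng()
for _ in range(5):
    outcome = rng.choice(4, p=probs)           # measurement collapses the state
    print(f"measured |{int(outcome):02b}>")    # only |00> or |11> ever appear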
Qubit microchips are built using traditional microfabrication techniques, including metal evaporation and deposition along with optical and electron-beam patterning. The technological demands of quantum devices push existing fabrication techniques to the limits of precision, uniformity, and control to prevent structural or material imperfections from degrading qubits’ tenuously held quantum states. Vince Lordi, materials scientist and deputy leader of Livermore’s Materials Science Division, says, “Every choice of material and type of interface is consequential. Device performance can be impacted by material defects ranging from oxidation to grain boundaries and dislocations to single-atom defects.” He and his colleagues perform atomistic simulations using HPC resources to understand how the properties of the materials used in quantum devices affect device fabrication and performance. Lordi says, “If we understand how the atomic-, micro-, and meso-structures of materials affect the performance of quantum computing devices, then we can identify the best materials and optimize fabrication processes to improve their performance.”
Lordi and his team also investigate the reverse: identifying the noise signatures of specific material defects amid a cacophony of noise sources that can jeopardize quantum coherence. These quantum systems often exhibit frequency-dependent noise that reflects their internal materials and structures. Lordi’s team identifies materials- and fabrication-based routes to mitigate noise in quantum devices, including superconducting qubits built upon silicon or sapphire substrates. On the substrate surface, paths of supercooled metals (aluminum, niobium, or tantalum) form superconducting electrical circuits with zero electrical resistance. The qubit’s key functionality emerges from tiny, nonsuperconducting layers, usually oxides, inserted in the superconducting paths to form Josephson junctions. (See S&TR, December 2018, Livermore Leaps into Quantum Computing.) “Growing atomically perfect layers of oxide on metals is tricky. The details of imperfections, especially at interfaces, seem critical to the control and performance of qubits, but exactly how the imperfections influence these factors is still under investigation,” says Lordi. Oxides can form in multiple chemical phases with distinct properties, so when the wrong phase forms, it can introduce identifiable quantum noise into the system.
Low-noise qubits are not the only challenge. “Quantum computers are more than a box of qubits. They contain massive amplifiers, filters, and circulators that we are integrating into a microscopic platform,” says Alegria. Livermore researchers have progressed in scaling down certain QC components to the size of microchips. These include circulators used in QC platforms to protect against electromagnetic crosstalk that could scramble qubit operations and readouts. Traditional circulators are too large to integrate into a many-qubit platform. Using emerging topological materials—whose interiors can act as insulators while their surfaces act as conductors—Livermore scientist Dongxia Qu and her team, together with university partners, developed miniaturized circulators 1,000 times smaller than previously achieved. The devices limit the spread of quantum noise introduced by the microwave signals that are used to send information between components. Control signals enter one port, and noise introduced during quantum computation is diverted out via an adjacent channel, ultimately improving user control.
Pulses and Logic
Beyond sustaining qubit coherence, a useful quantum computer must exchange information with the classical computing systems that control and read out its qubits. The exchange is accomplished by first sending microwave-frequency pulses that are timed and tuned to match the resonant frequency of the qubit’s oscillations. “If we don’t hit the qubit with the right frequency, it simply won’t react,” says Livermore mathematician Stefanie Günther. “Similar to pushing someone on a swing set, we need to apply energy at the right time and frequency. Otherwise, destructive interference will stop the swinging motion.” To observe the effect of a control pulse on the qubit’s energy state, a simple signal tuned to the resonant frequency of the readout resonator (a serpentine section of wire leading to the qubit) is sent through the device. Scientists then look for slight shifts in the readout pulse’s frequency spectrum that reveal changes to the qubit’s state.
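The swing analogy can be made quantitative with the textbook two-level (Rabi) model. In this hedged sketch, the drive strength and detunings are arbitrary illustrative values, not parameters of Livermore hardware: an on-resonance pulse flips the qubit completely, while a detuned pulse barely moves it.

```python
import numpy as np

def excited_population(t, rabi_rate, detuning):
    """Rabi's formula: excited-state probability under a constant drive."""
    omega_eff = np.sqrt(rabi_rate**2 + detuning**2)   # generalized Rabi rate
    return (rabi_rate / omega_eff)**2 * np.sin(omega_eff * t / 2)**2

rabi = 2 * np.pi * 5e6                 # 5-MHz drive strength (illustrative)
t_pi = np.pi / rabi                    # duration of a resonant "pi pulse"
for detuning_mhz in [0, 1, 10, 50]:
    delta = 2 * np.pi * detuning_mhz * 1e6
    p = excited_population(t_pi, rabi, delta)
    print(f"detuning {detuning_mhz:>3} MHz -> excited population {p:.3f}")
```

On resonance (zero detuning) the population reaches 1.0; at 50 megahertz off resonance, the qubit simply does not react, just as Günther describes.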
Optimizing control pulses is vital for establishing qubit ground states, performing logic operations, and interpreting qubits’ reactions. Improperly executed pulses dissipate energy into the environment, accelerating qubit decoherence. When qubits decohere, their superposition of states collapses prematurely into a single classical state; additional logical operations that rely on sustained coherence can no longer be executed, and any ongoing algorithm is terminated. Günther explains, “Physicists have mostly figured out the amplitudes, frequencies, and durations of the pulses needed to switch between the zero- and one-bit states in simple cases. However, once qubits become coupled to each other and support additional energy levels, their interactions become increasingly complicated, as do the control signals required to manipulate them.” Günther and colleagues in Livermore’s Center for Applied Scientific Computing use classical computing to optimize the control pulses sent to perform specific logic gate operations. The researchers developed HPC-ready software that applies gradient-based optimization methods to discover ideal pulse shapes that prompt specific qubit behaviors.
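As a toy stand-in for the team's HPC-ready software, the sketch below optimizes a piecewise-constant pulse by finite-difference gradient ascent so that it flips a qubit from |0⟩ to |1⟩; every parameter is illustrative, and a production optimizer would use far more sophisticated gradients and models.

```python
import numpy as np

SX = np.array([[0, 1], [1, 0]], dtype=complex)   # Pauli-X drive operator
N_SEG, DT = 8, 0.1                               # 8 pulse segments, arbitrary units

def evolve(amps):
    """Propagate |0> through a piecewise-constant drive H = a(t)/2 * sigma_x."""
    psi = np.array([1, 0], dtype=complex)
    for a in amps:
        theta = 0.5 * a * DT                     # exact 2x2 step:
        U = np.cos(theta) * np.eye(2) - 1j * np.sin(theta) * SX
        psi = U @ psi
    return psi

def fidelity(amps):
    return abs(evolve(amps)[1]) ** 2             # population transferred to |1>

amps = np.zeros(N_SEG) + 0.3                     # initial pulse guess
for step in range(200):                          # finite-difference gradient ascent
    grad = np.array([(fidelity(amps + 1e-6 * e) - fidelity(amps)) / 1e-6
                     for e in np.eye(N_SEG)])
    amps += 2.0 * grad
print(f"final fidelity: {fidelity(amps):.6f}")   # approaches 1.0
```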
Sophisticated classical algorithms are composed of simple Boolean logic gate components (devices that perform logical operations on one or more binary inputs and produce a single binary output) connected in sequences. In a similar manner, stringing together ideal, simple qubit control pulses in the right order can eventually achieve any desired QC operation. However, the number of required operations may become too great to execute before qubits decohere. “Using the principle of optimal control, we can reconfigure these transformations and compress operations into a single, short pulse sequence that simultaneously provides higher fidelity,” says Yujin Cho, a physicist in the QCDP group. Optimal control, a general technique to find the best operational protocol for a complex system when a huge set of solutions is possible, is used in many scientific and engineering domains ranging from dynamical systems such as space flight to plasma physics experimentation.
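The compression idea follows from linear algebra: any gate sequence multiplies into a single unitary matrix, which an optimal-control routine can then target with one short pulse instead of playing each gate's pulse in turn. A small illustrative sketch using standard single-qubit gates (not a Livermore circuit):

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)    # Hadamard gate
S = np.array([[1, 0], [0, 1j]])                 # phase gate
X = np.array([[0, 1], [1, 0]], dtype=complex)   # NOT gate

sequence = [H, S, X, S, H]                      # a five-gate circuit
combined = np.eye(2, dtype=complex)
for gate in sequence:
    combined = gate @ combined                  # left-multiply in time order

print(np.round(combined, 3))                    # one equivalent target unitary
# A pulse optimizer can now aim for `combined` with a single pulse,
# rather than five back-to-back gate pulses.
```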
Günther’s team also improves how the inner workings of quantum computers are simulated on classical computers. “If we execute these optimized pulses on a quantum device, there could still be a discrepancy between the predicted and measured qubit behaviors because our classical model does not perfectly match the device dynamics. No one knows exactly what the perfect parameters are yet in these equations. They could be uncertain, or they could drift over time due to introduced errors,” says Günther. The group refines their classical model of a simulated QC system using a neural network that iteratively adjusts terms in the differential equations governing quantum device processes, thereby improving modeling of QC hardware and of control pulses in tandem.
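The sketch below substitutes the simplest possible learned correction, a single uncertain model parameter fitted by gradient descent, for the team's neural network; it illustrates only the calibration loop of nudging model terms until simulation matches measurement, with synthetic "measured" data standing in for the device.

```python
import numpy as np

def simulate(rabi_rate, t):
    """Idealized model: excited-state population under a resonant drive."""
    return np.sin(rabi_rate * t / 2) ** 2

t = np.linspace(0, 10, 200)
measured = simulate(1.1, t)          # stand-in for data from the real device

rate = 1.0                           # model's initial, slightly wrong guess
for _ in range(500):                 # least-squares gradient descent
    eps = 1e-6
    loss = np.mean((simulate(rate, t) - measured) ** 2)
    loss_up = np.mean((simulate(rate + eps, t) - measured) ** 2)
    rate -= 0.02 * (loss_up - loss) / eps
print(f"calibrated Rabi rate: {rate:.4f} (device value 1.1)")
```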
Across the QC landscape, scientists are prolonging qubit lifetimes and executing elaborate quantum operations by tuning out destabilizing noise. “To understand what could one day be possible with hundreds or thousands of qubits, we first have to test our methods on the scales available,” says Rosen. In 2023, Livermore’s Quantum Device and Integration Testbed (QuDIT) facility installed a microchip containing 21 coupled superconducting qubits, a major step toward working with larger qubit arrays and longer, more complex algorithms. Livermore researchers Juergen Biener and Xiaoxing Xia have also made strides in fabricating hardware for the trapped-ion QC platform, successfully 3D printing high-resolution, gold-coated ion traps for atomic qubits at the Advanced Manufacturing Laboratory. Concentrating many QC resources under one roof has spurred greater onsite collaboration and resource sharing. QuDIT users can even remotely access the facility through a virtual private network and interact with quantum systems using a standard Python interface. Rosen says, “I could access the lab from my office this very second to place a qubit into superposition and obtain pure randomness, or to create an intricately entangled state of quantum objects. The impacts of these capabilities are going to be immense.”
Beyond Computing
The QC principles of superposition and entanglement are central to other emerging, quantum-enabled technologies. Quantum sensing, another subfield of quantum science, is at the cutting edge of detection and measurement techniques. Circulator devices protect qubits from noise sources by enabling signals to pass from cold regions to warm ones without introducing thermal noise. They also enable the pump tones needed to operate quantum-limited amplifiers to be added to the weak, desired signal with minimal added noise. This signal amplification is a critical capability for researchers investigating gravitational waves, dark matter, axions, and other elusive astrophysical phenomena whose signatures in collected data are otherwise disguised by noise. (See S&TR, July 2020, Tuning into Dark Matter.) Livermore scientists have used quantum entanglement to transform fiber optics into quantum sensors in which vibrations, chemical reactions, or temperature fluctuations occurring along a fiber’s path cause detectable phase shifting and scattering of broadcast light, alerting operators to otherwise hidden events. (See S&TR, December 2023, Beneath the Surface.) Such a technique is useful for monitoring nuclear reactor and geologic operations, both instances in which early and precise detection of faint signals bolsters facility safety.
In another project, Livermore physicist Ted Laurence and his team have constructed a quantum-enabled optical microscope to probe the host–bacteria interactions involved in biofuel production. The setup performs a technique called ghost imaging by irradiating a crystal to produce pairs of entangled photons. Shuttled onto separate optical paths, one photon interacts with the imaged subject (algae) while the other proceeds to a separate detector. Because the two photons’ states are entangled and their combined energy is conserved, researchers can infer an image even from photons that never directly interacted with the subject. Ghost imaging reveals more dimensions of information than a single photon stream. Laurence’s team collects high-dimensional data on 3D structures by measuring absorption, scattering, and fluorescence on a photon-by-photon basis—thus with significantly greater fidelity than tomography methods that image via vertical slicing. The microscopy technique is ideal for tracking single particles and cells as they diffuse in a solution, and it shows particular promise for medical and bioimaging applications.
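Laurence's microscope uses entangled photon pairs; the sketch below instead shows the closely related classical-correlation form of the technique (computational ghost imaging) to illustrate how an image emerges from a detector that never spatially resolves the object. All values are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
n_pix, n_shots = 32, 20000
obj = np.zeros(n_pix)                 # a 1D "object" with two bright regions
obj[10:16] = 1.0
obj[22:25] = 0.5

patterns = rng.random((n_shots, n_pix))   # reference-arm speckle patterns
bucket = patterns @ obj                   # bucket detector: total light only

# Correlating the bucket signal with each reference pixel recovers the object:
image = (patterns * bucket[:, None]).mean(axis=0) \
        - patterns.mean(axis=0) * bucket.mean()
image /= image.max()
print(np.round(image, 2))                 # peaks where the object transmits
```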
In addition, quantum sensing devices are revealing the invisible. Physicist Steve Libby leads a research team that uses supercooled atoms to detect hidden objects from their gravity signatures. From basic kinematics (the branch of mechanics concerned with the motion of objects without direct reference to the forces that cause the motion), the local strength of gravity can be calculated by noting the position of a free-falling object at three points in time. Using laser cooling, a technique that slows down and traps atomic particles, and then applying further controlled laser pulses, Libby’s team places free-falling atoms into a cycle called Rabi oscillation, in which the atoms regularly alternate between their ground state and an excited state, similar to an atomic clock. “Essentially, we capture a stroboscopic movie of falling atoms probed solely by carefully tuned laser light,” says Libby. Moving through a gravitational field alters the atoms’ oscillation phases, revealing regions of very small gravitational variation usually grossly overshadowed by Earth’s overall gravitational field.
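The three-point kinematics takes only a few lines: for constant acceleration, the average velocities over two successive intervals differ by g times half the total elapsed time. The sketch below uses synthetic drop data, not experimental measurements.

```python
def gravity_from_three_points(t1, x1, t2, x2, t3, x3):
    """Recover constant acceleration g from three (time, position) samples."""
    v12 = (x2 - x1) / (t2 - t1)        # average velocity, first interval
    v23 = (x3 - x2) / (t3 - t2)        # average velocity, second interval
    return 2 * (v23 - v12) / (t3 - t1)

g = 9.81  # generate synthetic drop data: x = 0.5 * g * t^2
samples = [(t, 0.5 * g * t**2) for t in (0.01, 0.02, 0.03)]
(t1, x1), (t2, x2), (t3, x3) = samples
print(gravity_from_three_points(t1, x1, t2, x2, t3, x3))   # recovers 9.81
```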
While gravimetry with mechanical sensors is centuries old and was used as early as the 1920s for oil discovery, the vastly improved measurement fidelity of quantum-enabled interferometers broadens its applications. “We can measure local gravitational differentials to such exquisite precision to use it as a form of tomography,” says Libby. Alongside industry partner AOSense, Inc., his team experimentally detected gravity signatures from hidden masses. Used in conjunction with radiation detectors, portable gravitational sensors could passively identify concealed materials, avoiding the invasive and potentially disruptive effects of x-ray imaging. (See S&TR, September 2013, A Bright Idea for Microscopy.) The methods also apply belowground: Libby’s team has found that cold-atom interferometers can detect gravity signatures from anomalous structures or voids with high accuracy when contrasted with geologic data, promoting gravimetry as a valuable tool to reveal underground facilities and aid in treaty verification.
Centralizing Collaboration
Livermore is pursuing a range of quantum science research areas to leverage the potential of QC and quantum-enabled technologies. Given the field’s multidisciplinary and fast-growing nature, quantum science has lacked some of the organizational foundation of the established research communities from which it draws. “A significant language barrier exists between mathematicians, physicists, materials scientists, and computer scientists. Framing these complex problems in ways that researchers across these fields can communicate effectively is challenging,” says Günther.
Beck recognized that Livermore’s quantum-related research had outgrown informal coordination for collaboration, education, and resource sharing. She championed further development of LCQS, which was conceptualized by Livermore physicist Jonathan Dubois. “The center will provide internal coordination and community-building for our scientists across disciplines and directorates and serve as a base for external collaborations. Researchers at Livermore and globally are excited about emerging quantum technologies. We now have a streamlined approach to understanding what these technologies are currently capable of and where they’re going,” says Beck.
“Just as the Laboratory did with high-performance computing, if we want to make use of quantum information, we must be a part of its development. Using quantum information requires an interdisciplinary team of national laboratory experts working closely with universities and industry, which is precisely what the center will enable,” says Rosen, who is also LCQS deputy director. The organizational thrust is timely: 2025 will be the International Year of Quantum Science and Technology, marking 100 years since the birth of quantum mechanics. As institutions worldwide herald the milestone with broad quantum science research and educational initiatives, LCQS will pursue mission-driven research thrusts while growing the necessary talent base through seminars, workshops, and research traineeships that give students hands-on experience with QuDIT. While quantum systems are often mystifying, the efforts at Livermore to understand them at more practical levels and scales are increasingly tangible, reaffirming the Laboratory’s commitment to apply basic science understanding to evolving national security challenges.
—Elliot Jaffe
For further information, contact Kristi Beck (925) 423-4590 (beck37 [at] llnl.gov).