Livermore Leaps into Quantum Computing

As high-performance computing (HPC) systems evolve, the rate of progress is slowed by the physical limitations and power demands of conventional microchips. To overcome these obstacles, HPC experts currently pursue advances in parallel processing to increase overall performance and reduce energy consumption. (See S&TR, September 2016, Laying the Groundwork for Extreme-Scale Computing, March 2017, A Center of Excellence Prepares for Sierra, and July/August 2018, Interweaving Timelines for Faster Solutions.) The next step is exascale systems, which are slated to come online soon. Beyond exascale, the next frontier of the HPC continuum is quantum computing. For some computing problems—such as linear solvers and quantum simulations—the advances possible with quantum computing could be unprecedented. According to Livermore physicist Jonathan DuBois, “With HPC being central to the Laboratory’s national security mission, we are obligated to understand and advance the state of the art in computation, and the potential impact of quantum physics is enormous.”

Livermore scientists have been researching and developing quantum systems for approximately a decade, and today’s portfolio of projects reflects a multifaceted growth strategy. Funding sources include the Department of Energy’s Advanced Simulation and Computing (ASC) and Advanced Scientific Computing Research (ASCR) programs, as well as the Laboratory Directed Research and Development Program. DuBois leads the Laboratory’s Quantum Coherent Device Physics group, which focuses on connecting quantum technology with applications. He explains, “We are part of a scientific community that wants to develop the field, not simply build products.” Ongoing activities include demonstrating a fully programmable quantum computing system, improving superconducting materials and devices, deploying quantum sensors, and developing quantum algorithms. Progress has been encouraging, with Livermore recently deploying two functional quantum computing test beds.

The Quantum Lexicon

Image
Superposition is a major principle in quantum physics, occurring when particles exist in multiple states. This phenomenon allows a single quantum bit, or qubit, to represent 0, 1, or both. In contrast, a bit in classical computing can only represent 0 or 1.

At first glance, parallel computing systems such as Livermore’s existing supercomputers may seem equivalent to quantum computers in that all perform multiple operations simultaneously. However, the key difference is in how the two fundamentally tackle computing problems. A classical computer uses on/off transistors to store and process information, encoding data in binary digits, or bits, in one of two states—0 or 1. In contrast, a quantum computer operates on the principles of quantum physics, storing data in quantum bits, or qubits (pronounced “cue-bits”). Livermore’s qubits are superconducting electrical circuits that can exist in multiple simultaneous states—0, 1, or both. This principle of superposition is analogous to mathematically representing the state of a heads-or-tails coin flip whose outcome is still literally up in the air. The concept was first illustrated by Erwin Schrödinger’s famous 1935 thought experiment wherein a hypothetical unobserved cat is both alive and dead—but found to be one or the other when observed.
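For readers who want the formalism, a qubit’s superposition is conventionally written as a weighted combination of the two basis states (a standard textbook expression, not specific to Livermore’s hardware):

    |\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1

Measuring the qubit collapses it to 0 with probability |\alpha|^2 or to 1 with probability |\beta|^2, which is why the coin-flip analogy holds only until the coin is observed.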

Using qubits for computation increases processing power exponentially. Two qubits can store data in four states concurrently—00, 01, 10, and 11. A 64-qubit quantum processing unit (QPU) is equivalent to 2⁶⁴ bits—16 exabits—in a classical computer. For any number (n) of qubits, a quantum computer could perform 2ⁿ operations at the same time. A classical computer would take far longer to do so—in some cases, years compared to seconds. This promising leap in computing power is possible only if the superposition state can be precisely controlled to remain coherent. Otherwise, the qubit system can generate errors as it processes information. Coherence requires preserving the phase relationships among quantum states so that superposition persists, which in turn requires that operations on the qubits be reversible.
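A minimal Python sketch (illustrative arithmetic only, not Livermore code) makes the exponential scaling concrete: describing a general n-qubit state classically requires 2ⁿ complex amplitudes, a number that quickly outgrows any conceivable memory.

    # Illustrative sketch: classical memory needed to describe a general n-qubit state.
    # A general state of n qubits is a vector of 2**n complex amplitudes.

    def state_vector_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
        """Bytes needed to hold 2**n complex amplitudes (double-precision complex assumed)."""
        return (2 ** n_qubits) * bytes_per_amplitude

    for n in (2, 10, 20, 64):
        print(f"{n:>2} qubits -> 2**{n} = {2 ** n:,} amplitudes, ~{state_vector_bytes(n):,} bytes")

At 64 qubits the tally reaches roughly 3 × 10²⁰ bytes, which is the sense in which a classical machine “would take far longer” and, in practice, simply cannot keep up.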

Prolonging coherence is the key to sustaining quantum calculations. The smallest changes in the environment surrounding a qubit can cause a loss of coherence, also called decoherence, so scientists are keen to reduce interference from electromagnetic waves, temperature variations, and other variables in and around quantum hardware. Quantum computing—and quantum-coherent devices in general—therefore requires both using precisely controlled low-energy pulses to sustain superposition states and preventing other energy sources from disturbing those states. “We are working at the opposite end of the energy spectrum from explosives or galactic events,” explains Laboratory physicist Yaniv Rosen. “We are studying energies 100 million billion times fainter than the energy expended in a mosquito’s flight, down to 20 microelectronvolts.”
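As a rough illustration of why prolonging coherence matters (a generic exponential-decay model with hypothetical numbers, not a characterization of Livermore’s qubits), the chance that a qubit’s state survives a computation shrinks rapidly as operations accumulate:

    # Illustrative model only: exponential loss of a qubit's state with relaxation time T1.
    # Both time constants below are hypothetical, chosen just to show the scaling.
    import math

    T1 = 50e-6          # assumed relaxation (coherence) time: 50 microseconds
    gate_time = 50e-9   # assumed duration of one operation: 50 nanoseconds

    for n_ops in (1, 100, 1_000, 10_000):
        survival = math.exp(-n_ops * gate_time / T1)
        print(f"{n_ops:>6} operations: ~{survival:.3f} probability the state is intact")

Longer coherence times, or shorter and cleaner control pulses, push that survival probability toward one.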

Decoherence and other system noise can be sources of error in quantum computing. In classical computing, error correction helps make systems fault-tolerant by ensuring reliable data delivery and reconstruction, and the viability of quantum computing also depends on achieving such goals. DuBois points out, “No one experimenting in this field has yet demonstrated successful error correction, which is analogous to break-even in nuclear fusion.”
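For intuition only, the classical ancestor of error correction is the repetition code: store a bit three times and recover it by majority vote. Quantum error correction is far subtler, since quantum states cannot simply be copied, but this toy Python sketch conveys the underlying redundancy idea:

    # Toy classical repetition code, offered only as an analogy to quantum error correction.
    import random

    def encode(bit: int) -> list:
        return [bit, bit, bit]                      # three redundant copies

    def noisy_channel(codeword: list, flip_prob: float) -> list:
        return [b ^ (random.random() < flip_prob) for b in codeword]

    def decode(codeword: list) -> int:
        return int(sum(codeword) >= 2)              # majority vote

    trials = 100_000
    errors = sum(decode(noisy_channel(encode(1), 0.05)) != 1 for _ in range(trials))
    print(f"Residual error rate: {errors / trials:.4f} (versus 0.05 without coding)")

Quantum codes achieve an analogous suppression by spreading one logical qubit across many physical qubits, which is why demonstrating it experimentally would be such a significant milestone.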

A Supercool Facility

Image
Inside a test bed’s dilution refrigerator, gold-plated cans contain qubits, which are connected by wires to the rest of the assembly. Recirculating helium progressively cools the structure from top to bottom, with each circular plate introducing a colder phase. (Photo by Carrie Martin.)

A quantum computer does not look like a classical computer. Its refrigerator does not resemble a typical refrigeration unit, for instance. At Livermore, a quantum processor relies on superconductivity to reduce electrical resistance and interaction with the environment. The superconductive processor, consisting of particles in quantum circuits, is operated at extremely cold temperatures so that scientists can control the circuits’ quantum states. This approach may offer the best chance of achieving coherence goals. The sophisticated cooling infrastructure recirculates helium-3 and -4 isotopes through progressively colder stages, reducing the interior temperature to –273.1°C (0.007 kelvin). The dilution refrigerator operates under vacuum and is electrically shielded to minimize heat leaks and environmental noise.

Qubits are seated inside the refrigerator and connected to a suite of electronics that control superposition with microwave pulses. An arbitrary waveform generator provides gigahertz frequencies and amplitudes to interact with the qubit, and an oscilloscope monitors the input signals. The results of these manipulations are sent to an analog-to-digital converter to verify signal fidelity, and the calculation results are read on a standard computer. Indeed, classical computing plays an important role in calibrating, running, and maintaining the Laboratory’s quantum computing test beds. The Livermore team codes instructions for pulse shape and frequency in the Python programming language and uses HPC software to adjust designs for pulse control and other variables.
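As a hedged sketch of what such pulse definitions can look like (a generic Gaussian-envelope microwave pulse with made-up parameters, not the Laboratory’s control code), the samples handed to an arbitrary waveform generator might be computed as follows:

    # Generic Gaussian-envelope microwave pulse, sampled for an arbitrary waveform generator.
    # Sample rate, frequency, amplitude, and duration are hypothetical placeholders.
    import numpy as np

    sample_rate = 64e9       # assumed generator sampling rate, samples per second
    duration = 40e-9         # assumed 40-nanosecond pulse
    carrier_freq = 5e9       # assumed 5-gigahertz drive tone
    amplitude = 0.5          # arbitrary units

    t = np.arange(0, duration, 1 / sample_rate)
    sigma = duration / 6                                        # envelope width
    envelope = amplitude * np.exp(-0.5 * ((t - duration / 2) / sigma) ** 2)
    waveform = envelope * np.cos(2 * np.pi * carrier_freq * t)  # samples sent to the qubit

Adjusting the envelope shape, width, and carrier frequency is the kind of tuning the team automates with HPC software.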

Although many of these components are available commercially, no instruction manual exists for a fully integrated quantum system. The team’s hands-on experience and troubleshooting skills grow daily as they supply their own technical support. Eric Holland, Livermore’s chief quantum systems architect, explains, “We are assembling quantum computing components in ways others have not. Livermore is blazing a trail.” For example, the Laboratory’s qubits are housed inside canisters attached to the bottom of the refrigerator. Made of gold-plated copper to prevent oxidation and maximize thermal contact in a vacuum, the cans are designed, built, and coated at the Laboratory before being sent offsite for annealing. Special shielding protects the qubits from stray magnetic fields and prevents light leaks. Each test bed holds four cans, allowing multiple experiments to run simultaneously.

This machinery is inherently fragile. Rosen says, “On paper, a quantum system design can seem perfect, but the environment really comes into play when implementing the design.” However, DuBois cites the Laboratory’s expertise in HPC, engineering, materials science, cryogenic physics, and quantum physics as a potentially game-changing combination in solving the quantum computing puzzle. “Livermore’s people have a well-defined vision of how to advance this field,” he says. The Laboratory’s two test beds are similarly assembled but serve different purposes—one for quick tests and prototyping and the other for mature experiments. The former was installed in late 2017, while the latter came online in early 2018 to support the ASCR Quantum-Enabled Simulation (AQuES) Testbed Pathfinder Program, which brings Lawrence Livermore and Lawrence Berkeley national laboratories together to pursue diverse research and development in quantum computing.

Building Better Qubits

Livermore scientists are developing quantum computing components alongside a new class of superconducting materials for low-energy regimes. These efforts span qubit design, QPU configuration, quantum chip circuitry, and quantum materials science. (See S&TR, March 2016, Taking Topological Insulators for a Quantum Spin.) Holland explains, “The design space is ripe for exploration. Our internal investments allow us to question others’ approaches.” Classical computing advancements typically focus on planar chip design, and industry’s prevailing quantum chip architecture is a two-dimensional (2D) lattice of qubits, each controlled by a separate oscillating signal input. However, both 2D and three-dimensional (3D) designs are being pursued at the Laboratory. DuBois states, “For better efficiency, we are trying to achieve the same computational power with one input port per system, not per qubit—a completely different paradigm for controlling the basic unit of a scalable quantum computer.”

Image
A Josephson junction is the key component of superconducting qubit circuitry. (top) The Laboratory’s fabrication process, shown in side view, includes precisely controlled deposition of layered materials (green, orange, yellow) on a substrate (blue). (bottom) Scanning electron microscopy shows a top view of a complete Josephson junction after deposition.

Furthermore, industry typically offers nearest-neighbor connections between qubits, which means lattice arrays increase as more neighbors per qubit are added, making the device less efficient as it grows in size. By the end of the 5-year AQuES Testbed Pathfinder Program, the Livermore team intends to stand up a working 20-qubit QPU with all-to-all connectivity—any pair or trio of qubits, and any combination of pairs or trios, will be interconnected. “This QPU size is the equivalent of a matrix of about a million squared, which is a good starting point for an HPC system to simulate,” says Holland.
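The “million squared” figure follows directly from the state space: 20 qubits span 2²⁰, or roughly 1.05 million, basis states, so a complete description is about a million-by-million matrix. A quick Python check (illustrative arithmetic only) also contrasts all-to-all with nearest-neighbor wiring:

    # Illustrative arithmetic for a 20-qubit processor.
    from math import comb

    n = 20
    print(f"State-space dimension: 2**{n} = {2 ** n:,}")      # about 1.05 million
    print(f"All-to-all pairwise couplings: {comb(n, 2)}")     # 190 direct pairs

    # A 4 x 5 nearest-neighbor lattice, by contrast, offers far fewer direct couplings.
    rows, cols = 4, 5
    print(f"Nearest-neighbor couplings on a 4x5 lattice: {rows * (cols - 1) + cols * (rows - 1)}")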

The team is experimenting with several designs for manufacturing and positioning qubits, all aimed at minimizing energy loss and error rates while maximizing performance. The Laboratory’s qubits are based on Josephson junctions, in which two superconducting materials are separated by a thin insulating barrier. In an environment cooled nearly to absolute zero, this design allows current to flow between the superconductors with essentially no applied voltage. “Josephson junctions are the essential ingredient in superconducting QPUs,” says Holland. The team creates Josephson junctions using electron-beam lithography and evaporation, depositing overlapping layers of aluminum and oxide coatings onto a substrate.
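For reference, the two standard Josephson relations (textbook physics, not specific to Livermore’s fabrication) connect the supercurrent I and voltage V across the junction to the superconducting phase difference φ:

    I = I_c \sin\varphi, \qquad V = \frac{\hbar}{2e}\,\frac{d\varphi}{dt}

The first relation is what lets a supercurrent cross the insulating barrier with no applied voltage; the junction’s nonlinear inductance is what gives a superconducting circuit the unevenly spaced energy levels that can be addressed as a qubit.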

An effort at design improvement replaces the two-level qubit with a multilevel unit known as a “qudit.” This highly efficient, multidimensional configuration stores data in more than two states and with lower error rates. The more levels a qudit has, the more qubits’ worth of information it represents and the greater its calculation potential. A new, homegrown QPU design, nicknamed the quad core, begins with qubits fabricated on sapphire wafers, which are then cut into strips. Inside the high-purity aluminum core, a three-qubit strip is flanked by four qudits. This layout results in all-to-all connectivity.
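In the same illustrative notation used earlier for a qubit, a qudit generalizes the state to d levels:

    |\psi\rangle = \sum_{k=0}^{d-1} c_k\,|k\rangle, \qquad \sum_{k=0}^{d-1} |c_k|^2 = 1

Because a single d-level qudit can carry roughly log₂(d) qubits’ worth of information, fewer physical elements and control lines are needed for the same computational space.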

Do-It-Yourself Materials

Image
Livermore’s innovative “quad core” QPU provides all-to-all connectivity among three qubits and four qudits. The design also improves resource efficiency with multipurpose ports that can be used for inputs or outputs.

Beyond serving as the building blocks of quantum computing, qubits also help Livermore researchers probe low-energy and -temperature systems in general, which in turn helps the team build better qubits. Rosen explains, “Not many institutions are investigating materials development for quantum systems. While others try to improve construction, we also strive to understand the root of the problem, such as which material properties are affected by minuscule energy changes.” For instance, Rosen studies surface defects, which behave completely differently from bulk defects. Disruption in surface-level energy is a potential source of decoherence in quantum circuits and qubits.

Livermore researchers are developing 2D quantum chips containing unique resonator geometries. A 2D resonator is a pattern of conductive material used to optimize oscillation signals, visually resembling a television antenna flattened onto a plane inside a microchip. The width of each tine and the spacing between tines affect the electrical flow through the resonator. In recent experiments, Rosen used the Laboratory’s quantum computing test beds to measure surface defects in superconducting aluminum resonators. The test bed’s ultracold environment reduces electronic interference so that the team can track a single photon’s passage through a resonator pattern. The longer the photon stays inside the resonator before “ringing down,” the longer the coherence time. Rosen summarizes, “If we can store a photon indefinitely by controlling or mitigating surface defects, we can extend the quantum computing time limit.”
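As a hedged sketch (a generic exponential ring-down fit with synthetic data, not the team’s analysis code), the photon lifetime and the resonator’s quality factor can be estimated from a ring-down trace like so:

    # Generic ring-down analysis: fit an exponential decay to estimate photon lifetime.
    # The resonator frequency and synthetic data below are hypothetical placeholders.
    import numpy as np

    f_res = 6e9                                    # assumed resonator frequency, 6 gigahertz
    t = np.linspace(0, 50e-6, 500)                 # 50-microsecond measurement window
    true_tau = 8e-6                                # hypothetical photon lifetime
    signal = np.exp(-t / true_tau) * (1 + 0.01 * np.random.randn(t.size))

    # Fit log(signal) to a line; the slope gives -1/tau.
    slope, _ = np.polyfit(t, np.log(np.abs(signal)), 1)
    tau_fit = -1 / slope
    quality_factor = 2 * np.pi * f_res * tau_fit   # Q = omega * tau
    print(f"Fitted lifetime: {tau_fit * 1e6:.1f} microseconds, Q ~ {quality_factor:.2e}")

A longer fitted lifetime means the photon lingers longer before ringing down, which is the figure of merit Rosen’s surface-defect studies aim to improve.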

Another project uses Livermore’s additive manufacturing capabilities to create a 3D resonator whose conical cavity may prolong quantum states. “Sources of energy loss in 3D resonators are different than in 2D resonators,” explains Rosen. The team seeks to understand energy loss in resonator cavity areas of high current and high electric field. They are also investigating materials with high kinetic inductance, a property describing the energy stored in the motion of a superconductor’s paired electrons. Experiments run on Livermore’s test beds characterize electrical resistance along the cavity’s surface.

Image
Fabricated with the Laboratory’s photolithography equipment, an aluminum quantum chip is ready for testing. This chip contains 5 resonator layouts, each up to 1 millimeter wide. (Photo by Randy Wong.)

The superconducting 3D resonator is made of a common alloy of titanium, aluminum, and vanadium known as Ti64, often used in additive manufacturing at Livermore. The cylindrical device measures 25 millimeters in diameter and is fabricated with selective laser melting because conventional machining cannot create the special cavity shape. Investigating and testing the Ti64 and other cavity systems augment Livermore’s approach to quantum technologies. In 2018, physicist Gianpaolo Carosi led a workshop at the Laboratory’s Livermore Valley Open Campus to review the latest in cavity research. Attendees hailed from other national laboratories, international organizations, and academic institutions. He says, “Better cavities mean better qubit control. We see much synergy in developing these systems for superconducting qubits and accelerator experiments.”

In addition, Laboratory researchers are collaborating with the University of California at Berkeley to explore other types of resonator materials, such as amorphous silicon. Rosen says, “Growing crystalline materials for 2D resonators is very difficult, so we are considering different configurations of atoms in amorphous structures.” While Berkeley scientists measure defects that interact with energy vibrations, Livermore measures defects affected by electric fields. The two teams are comparing their results to learn how defects behave in amorphous materials. “This is the first attempt at correlating electrical and vibrational defects to find their origin,” states Rosen. The Laboratory’s additive manufacturing capabilities again play a role, as the two institutions contribute different fabrication methods to the project. Livermore’s multipronged efforts to solve the coherence problem are bearing fruit for several applications, such as laser amplifiers, single-photon and low-temperature microwave detectors, high-precision quantum sensors, and more compact quantum-coherent devices. The work also advances the overall state of the art in additive manufacturing.

Ultrasensitive Detection

Another important research thrust at the Laboratory is quantum sensing, which exploits quantum superposition, entanglement, and squeezing to achieve ultrahigh resolution. (Entanglement and squeezing involve multiphoton quantum states that are dependent on or correlated with one another.) Carosi notes, “We search for changes that standard detection systems cannot find, such as a tiny frequency shift amid background noise.” Applications include metrology, distributed and remote sensing, gravimetry, spectroscopy, and quantum-based imaging.

Image
The cavity of Livermore’s superconducting three-dimensional resonator was fabricated with selective laser melting from a titanium–aluminum–vanadium alloy known as Ti64. This type of resonator is used to study energy loss in superconducting materials.

Quantum computing test beds are also indispensable for the Laboratory’s contributions to the Axion Dark Matter Experiment (ADMX). Begun at Livermore and now sited at the University of Washington, the experiment searches for the dark matter thought to constitute nearly one-fourth of the universe’s energy density. (See S&TR, January/February 2015, Investing in Early Career Researchers.) As the project’s cospokesperson, Carosi leads a team supporting the search for theorized particles called axions. Carosi explains, “The axion is one of the most likely dark matter candidates. The ADMX system is designed to detect axions by their coupling to photons, when they convert to microwaves in the presence of a strong magnetic field.” The primary background signals come from photons produced by thermal noise, and the exceptional sensitivity of quantum-coherent devices helps find the proverbial needle in the haystack. Similar to a quantum computer, the ADMX system contains a superconducting dilution refrigerator to reduce thermal background noise, along with a large-bore magnet and a resonant cavity to boost the axion-to-photon conversion signal.

The Livermore team is testing two types of devices for the ADMX project—microwave cavity systems and amplifiers based on superconducting quantum interference devices (SQUIDs). These cavity systems—which at 1 meter long are much bigger than the Ti64 cavity—are built at the Laboratory and tested for resonance quality at different frequencies. SQUID-based amplifiers decrease background noise inside the cavity, thereby clarifying potential axion signals. The team has also combined multiple SQUID amplifiers into a “SQUIDADEL” to search multiple frequencies simultaneously. In fact, Livermore is the only facility with the equipment to test the full SQUIDADEL system. Carosi states, “The more precisely we can measure, the better chance we have to solve the universe’s mysteries.”

New Programming Paradigm

Classical computers may help run quantum computers, but classical programming is insufficient for quantum calculations. DuBois explains, “Quantum computers need a different kind of programming, which includes intimately understanding the physics of the system to be simulated and the system running the simulation.” As part of ASC’s Beyond Moore’s Law Project, Livermore scientists are developing new algorithms to run on the quantum test beds. The project’s twin goals are improvements in basic science and scientific computing. DuBois notes, “We have the potential to achieve huge increases in scientific computing speed with this technology.”

Image
Livermore researchers Nathan Woollett (left) and Gianpaolo Carosi inspect the cryostat used for testing prototype microwave cavity systems for the Axion Dark Matter Experiment. The exceptional sensitivity of quantum-coherent devices can help find the axion, considered one of the most likely dark matter candidates. (Photo by George Kitrinos.)

For the basic science goal, the team strives to efficiently simulate quantum dynamics in physical systems such as chemical reactions or atomic-level material behavior. With classical computing, simulations become more computationally expensive at smaller temporal and spatial scales, and quantum phenomena cannot be simulated directly or effectively. At best, solving quantum equations on a classical computer results in approximations, but more accurate simulations on a quantum computer could feed into larger models, improving simulation quality. Key to the scientific computing goal are new algorithms to solve applied problems—such as linear equations—and optimized functions to enable large-scale multiphysics simulations. The team is therefore focusing on mapping these algorithms to the quantum hardware and demonstrating their efficacy.
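A brief Python sketch (illustrative only) shows where classical simulation of quantum dynamics hits its wall: evolving an n-qubit system in time means exponentiating a 2ⁿ-by-2ⁿ matrix, whose size doubles with every qubit added.

    # Illustrative cost of simulating quantum dynamics directly on a classical machine.
    # Time evolution under exp(-iHt) involves a 2**n x 2**n matrix.
    import numpy as np
    from scipy.linalg import expm

    def evolve_random_system(n_qubits: int, t: float = 1.0) -> np.ndarray:
        dim = 2 ** n_qubits
        a = np.random.randn(dim, dim) + 1j * np.random.randn(dim, dim)
        hamiltonian = (a + a.conj().T) / 2          # random Hermitian stand-in for H
        state = np.zeros(dim, dtype=complex)
        state[0] = 1.0                              # start in the all-zeros basis state
        return expm(-1j * hamiltonian * t) @ state  # final state after time t

    final_state = evolve_random_system(3)           # small example: 8 amplitudes
    for n in (8, 12, 16):
        dim = 2 ** n
        print(f"{n} qubits: {dim} x {dim} matrix, ~{dim * dim * 16 / 1e6:,.0f} MB to store")

A quantum processor sidesteps that blow-up because the hardware itself carries the 2ⁿ amplitudes, which is the advantage the team’s algorithm work aims to exploit.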

Exponential Potential

Even with an already sizeable portfolio of ongoing projects, scientists are eager to advance the Laboratory’s quantum ecosystem further and expand collaborations with HPC experts. For instance, DuBois’s team plans to deploy a larger, more versatile test bed. Another effort may involve distributed quantum computing, integrating the test beds with the Laboratory’s HPC systems. Other plans target the user experience, revealing the inner workings of a quantum computer so users can see how their experiments are being run. Holland adds, “If researchers want to access the system remotely, from their own office, for instance, we could enable this first at Livermore and then expand to other national laboratories and external collaborators.”

Meanwhile, work continues amid another unique challenge—finding and nurturing specialists in quantum computing. “The field requires multidisciplinary expertise and a diverse skill set,” states Holland. Few graduate programs train students to contribute to quantum computing, and mid-career staff may not consider changing paths. However, DuBois’s own career is an example of the Laboratory’s professional development opportunities in emerging science and technology in general. He points out, “I don’t know of any other institution where a theoretical computational physicist can transition to experimental quantum physics.”

Livermore embraces its pioneering role in a field Carosi calls “the Wild West.” Holland emphasizes, “Quantum computing is not a fantasy—it’s already happening. Investments are crucial for making further progress and delivering on our missions. A broad, coordinated effort is required, and Livermore is a great hub for synthesizing resources to meet this lofty but important goal.”

—Holly Auten

Key Words: additive manufacturing, algorithm, ASCR Quantum-Enabled Simulation (AQuES) Testbed Pathfinder Program, Axion Dark Matter Experiment (ADMX), coherence, high-performance computing (HPC), Josephson junction, quantum-coherent device, quantum computing, quantum processing unit (QPU), quantum sensing, qubit, qudit, superconducting quantum interference device (SQUID), superconductivity, superposition.

For further information contact Jonathan DuBois (925) 422-1406 (dubois9 [at] llnl.gov).