Taming the Wild Frontiers of Plasma Science

Plasmas are involved in a wide range of atmospheric and astrophysical phenomena, from lightning flashes to accretion disks around black holes. Of particular interest to Livermore researchers are nonequilibrium (multi-temperature) plasma systems that exist at extreme conditions: temperatures of a million or more kelvins, pressures of a million or more times Earth’s atmosphere at sea level, and densities equivalent to those of many metals. Such plasmas occur in stellar interiors, giant planets, nuclear weapons detonations, and inertial confinement fusion (ICF) experiments. Studying these conditions on Earth, in the absence of nuclear testing, is no easy task.

Livermore physicist Frank Graziani is working to facilitate the study of plasma physics through more realistic computational models. “While investigating burn physics for programmatic applications, I began to question the fidelity of the plasma models we were using in our radiation–hydrodynamics codes,” says Graziani. “The regimes we care about are extreme and complex, and the experimental data with which we can compare our simulations are limited, so we have to rely on theory and approximations.” Graziani and colleagues, including Los Alamos National Laboratory’s Michael Murillo, an expert in matter at extreme conditions, and Livermore computational physicist Fred Streitz, formulated a different approach to plasma studies by creating a “virtual plasma” and probing it just as experimentalists would diagnose a real one.

A Powerful, Scalable Code

The research team explored the virtual plasma concept through the Cimarron project, an ambitious initiative funded by the Laboratory Directed Research and Development Program. The initiative was designed to predict and measure the properties of dense plasmas. Over the course of eight years, Cimarron grew to encompass collaborators from three national laboratories and five universities, including theorists for modeling, experimentalists to gather validation data, computational physicists to run simulations, and computer scientists to keep calculations running smoothly on some of the largest and most sophisticated computers available.

The Livermore-developed massively parallel molecular dynamics (MD) code ddcMD served as the backbone for the initiative. MD codes offer a different approach to modeling than traditional hydrodynamics codes. MD simulations follow the trajectory of each particle in a system, while hydrodynamics simulations treat matter as a fluid that flows through a mesh. MD simulations allow researchers to study the behavior of matter on nanometer (billionth-of-a-meter) and femtosecond (one-quadrillionth-of-a-second) scales, while hydrodynamics simulations are used to study matter on, at minimum, micrometer and picosecond scales—a thousand times larger in both length and time.
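The contrast is easiest to see in miniature. Below is a sketch of the particle-by-particle approach in Python, using a bare Coulomb force and the common velocity Verlet integrator; everything about it, from the direct O(N^2) force loop to the units, is illustrative only and vastly simpler than ddcMD itself.

```python
# Minimal molecular dynamics sketch: every particle's trajectory is advanced
# explicitly, unlike a hydrodynamics code's fluid-on-a-mesh view.
# Illustrative only; not ddcMD. Gaussian (cgs) units assumed throughout.
import numpy as np

def pair_forces(pos, charge):
    """Direct O(N^2) sum of pairwise Coulomb forces."""
    forces = np.zeros_like(pos)
    for i in range(len(pos)):
        r = pos[i] - pos                      # vectors from each particle to i
        d2 = np.einsum("ij,ij->i", r, r)      # squared separations
        d2[i] = np.inf                        # exclude self-interaction
        forces[i] = np.sum((charge[i] * charge / d2**1.5)[:, None] * r, axis=0)
    return forces

def velocity_verlet(pos, vel, charge, mass, dt, steps):
    """Advance all trajectories by `steps` timesteps of length `dt`."""
    f = pair_forces(pos, charge)
    for _ in range(steps):
        vel += 0.5 * dt * f / mass[:, None]   # half-kick
        pos += dt * vel                       # drift
        f = pair_forces(pos, charge)          # forces at new positions
        vel += 0.5 * dt * f / mass[:, None]   # second half-kick
    return pos, vel
```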

Adapting ddcMD, which was initially created for materials science applications, to plasma studies posed a computational challenge. A plasma is a cloud of charged particles (ions and electrons) created when electrons detach from their respective atoms and molecules. The detached electrons enable the plasma to act as a whole rather than simply a cluster of individual particles. Given the significant role electrons play in plasma behavior, the code had to describe them accurately. The problem was that MD is typically used to simulate the movement of atoms, molecules, or ions, but almost never electrons. “With the Cimarron code, we broke the electrons out and let them do what they wanted,” notes Murillo, evoking the project’s name, which means “wild and untamed” in Latin-American Spanish.

To do so, the code needed to account for electrons’ short-range interactions—collisions with protons and other electrons—and their longer range interactions, particularly the attraction and repulsion that charged particles exert on one another, known as the Coulomb force. The Cimarron team incorporated functions derived from fundamental laws of quantum mechanics into ddcMD, which enabled the code to accurately calculate electron interactions at both length scales during the course of a simulation.
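One widely used family of such quantum-derived functions is the quantum statistical potentials, which soften the Coulomb interaction at separations below a particle pair’s thermal de Broglie wavelength so that classical point particles mimic quantum diffraction at close approach. The sketch below uses a simple Deutsch-style form for illustration; it is an assumption for exposition, not the specific potentials implemented in ddcMD.

```python
# Diffraction-corrected (Deutsch-style) pair potential: finite as r -> 0,
# ordinary Coulomb q1*q2/r well beyond the thermal de Broglie length.
# Illustrative stand-in for the quantum statistical potentials in ddcMD.
import numpy as np

HBAR = 1.0546e-27   # reduced Planck constant, erg*s (cgs units)
KB = 1.3807e-16     # Boltzmann constant, erg/K

def de_broglie_length(mu, T):
    """Thermal de Broglie length for a pair with reduced mass mu (g) at T (K)."""
    return HBAR / np.sqrt(2.0 * np.pi * mu * KB * T)

def quantum_pair_potential(r, q1, q2, mu, T):
    """Softened Coulomb potential between charges q1, q2 (statcoulombs)."""
    lam = de_broglie_length(mu, T)
    return (q1 * q2 / r) * (1.0 - np.exp(-r / lam))
```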

How the code breaks up the problem and maps the pieces to computer processors is an important consideration, as it helps determine the maximum problem size and the speed at which the problem can be solved. (See S&TR, July/August 2006, Keeping an Eye on the Prize.) “The short- and long-range interactions have a different computational character, so we map them accordingly,” notes ddcMD architect and Cimarron team member Jim Glosli. “Allocating a small piece of the machine to long range and the bulk to short range minimizes global communication cost.” The result is one of the world’s fastest and most scalable Coulomb solvers. An early simulation incorporating 2.4 billion particles run on the Laboratory’s Sequoia supercomputer earned Cimarron team members recognition as finalists for the 2009 Gordon Bell Prize for outstanding achievement in computing. (See S&TR, September 2010, Crossing Computational Frontiers.)
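Many Coulomb solvers give the two ranges their different computational characters through an Ewald-style split of the 1/r interaction: a short-range term that decays so quickly it can be handled by local pair loops with a cutoff, plus a smooth long-range remainder suited to a global mesh solve. The sketch below illustrates the split itself; ddcMD’s actual decomposition and solver are more sophisticated.

```python
# Ewald-style splitting of the Coulomb kernel 1/r into pieces with
# different computational character. Illustrative; not ddcMD's solver.
import numpy as np
from scipy.special import erf, erfc

def split_coulomb(r, alpha):
    """Return (short, long) parts of 1/r; they sum back to 1/r exactly."""
    short = erfc(alpha * r) / r   # negligible beyond a few 1/alpha: local work
    long_ = erf(alpha * r) / r    # smooth everywhere: global mesh/FFT work
    return short, long_

r = np.linspace(0.1, 10.0, 5)
short, long_ = split_coulomb(r, alpha=1.0)
assert np.allclose(short + long_, 1.0 / r)   # the decomposition is exact
```

Because the smooth long-range part needs relatively few degrees of freedom while the short-range pair work scales with the particle count, assigning a small set of processors to the former and the rest to the latter, as Glosli describes, keeps global communication off the critical path.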

The Cimarron project aimed to understand extreme states of matter, those beyond the red 100-gigapascals pressure line shown here. This region includes plasmas featuring high temperature and density, such as those produced in National Ignition Facility (NIF) experiments and stellar interiors, and lower temperature, high-density plasmas found in giant planets such as Jupiter. By contrast, the surface of the Sun is relatively cool and dilute.

Accounting for the Physics

Once ddcMD was adapted for plasma physics, the Cimarron team began incorporating capabilities for simulating and investigating key atomic, radiative, and nuclear processes. “The plasmas we care about involve different types of ions,” explains Graziani. “At the temperatures we’re interested in, plasmas of low-Z elements (those with low atomic numbers), such as hydrogen, are completely ionized. But higher-Z plasmas, such as those containing argon or silver, have only some electrons stripped off, so we have to account for their atomic physics effects.” The challenge was addressed, in part, by adding atomic physics to the code—a feature that no other MD code possesses and that is currently being used to develop a biomolecule x-ray imaging capability. (See S&TR, April/May 2016, A Virtual Laboratory for Studying Biological Structures.)
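A textbook Saha-equation estimate illustrates Graziani’s point about low-Z elements. The sketch below is a simplified equilibrium calculation for hydrogen, offered only for exposition; it is not the atomic physics package that was added to ddcMD.

```python
# Saha-equation estimate of hydrogen's ionization fraction. For hydrogen the
# statistical-weight prefactor 2*g_ion/g_atom equals 1 and is omitted.
# Textbook equilibrium model only; not ddcMD's atomic physics package.
import numpy as np

ME = 9.109e-28      # electron mass, g
KB = 1.3807e-16     # Boltzmann constant, erg/K
H = 6.626e-27       # Planck constant, erg*s
CHI_H = 2.179e-11   # hydrogen ionization energy (13.6 eV), erg

def saha_ionization_fraction(T, n_total):
    """Ionized fraction at temperature T (K) and total number density n_total (cm^-3)."""
    S = (2.0 * np.pi * ME * KB * T / H**2) ** 1.5 * np.exp(-CHI_H / (KB * T))
    a = S / n_total
    # Solve x^2 / (1 - x) = a for the fraction x in [0, 1].
    return (-a + np.sqrt(a * a + 4.0 * a)) / 2.0

# At a million kelvins, hydrogen is nearly fully stripped even at high density:
print(saha_ionization_fraction(1.0e6, 1.0e23))   # ~0.95
```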

Experimental validation of ddcMD results, an important component of the Cimarron project, focused on regimes where experiments were likely to produce useful data. If the results of the experiments and the code match, it boosts confidence that the code can predict results at more extreme temperatures and densities. Using short-duration x-ray pulses at SLAC National Accelerator Laboratory’s Linac Coherent Light Source (LCLS) and short, intense laser pulses at Livermore’s Jupiter Laser Facility, scientists excited and heated graphite targets. When plasma formed, they probed the samples for information on the plasma’s properties and behavior. These data were then compared to results from the ddcMD simulations. Although the team has identified several areas where more physics could be added to enhance the code, the match was quite good. Experimentalist Stefan Hau-Riege observes, “We are now routinely using ddcMD to describe nonequilibrium conditions encountered at LCLS.”

As part of the Cimarron project, researchers used the molecular dynamics code ddcMD to study stopping power—the rate at which high-energy projectiles slow down, deposit energy, and start to heat the surrounding plasma. This simulation, run on the Laboratory’s Sequoia supercomputer, shows the stopping power of a 100-million-particle hydrogen–argon plasma. The bright colors show how much energy the projectile (black) and the plasma (gray) are exchanging. Blue regions indicate where energy is being transferred to the plasma from the projectile, and red regions suggest where energy is being transferred from the plasma to the projectile.
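In a virtual plasma, stopping power can be diagnosed much as an experimentalist would measure it: record the projectile’s kinetic energy along its path and take the slope. The sketch below assumes trajectory arrays of the kind an MD run would write out; it is illustrative, not the Cimarron team’s analysis code.

```python
# Estimate stopping power (-dE/dx) from a projectile's sampled trajectory.
# Illustrative post-processing sketch; inputs are placeholder MD output.
import numpy as np

def stopping_power(positions, kinetic_energies):
    """positions: (n_samples, 3) projectile positions (cm);
    kinetic_energies: (n_samples,) projectile kinetic energies (erg)."""
    steps = np.diff(positions, axis=0)
    path = np.concatenate([[0.0], np.cumsum(np.linalg.norm(steps, axis=1))])
    # Least-squares slope of E versus distance traveled; negate for -dE/dx.
    dE_dx = np.polyfit(path, kinetic_energies, 1)[0]
    return -dE_dx
```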



The Cimarron team used the ddcMD code to simulate the structure of graphite as well as plasma formation within the material as it was exposed to intense x-ray pulses at the Linac Coherent Light Source. Shown here is the lattice structure of graphite as it evolves over 70 femtoseconds.

Beyond Cimarron

The Cimarron project ended in 2014, but the pursuit of a better understanding of dense plasmas continues. At Los Alamos, Cimarron’s successor, “Nambe,” explores MD, kinetic, and hybrid approaches to plasma simulation. Efforts at Livermore focus on the application of ddcMD to increasingly complex problems. Design physicist Heather Whitley, for instance, performs dynamic ddcMD simulations of shocks for comparison with hydrodynamic codes. “We’re at the point where we can use ddcMD to directly examine problems,” she says. “Much of my current work is focused on designing National Ignition Facility (NIF) experiments to measure some of the properties that we’ve examined with ddcMD.”

Physicists Tomorr Haxhimali and Robert Rudd are using ddcMD on Lawrence Livermore’s Vulcan supercomputer to study transport processes under plasma conditions relevant to NIF’s ICF experiments. Their work is helping researchers understand how the fusion fuel mixes with the plastic shell of an ICF target capsule, an undesirable but common occurrence. Whitley has also used this simulation capability to complete a major research milestone for the National Nuclear Security Administration.
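Transport coefficients such as the diffusivities that govern fuel–shell mix emerge from MD trajectories through standard relations. One of the simplest is the Einstein relation, in which a species’ self-diffusion coefficient is the long-time slope of its mean-squared displacement; the sketch below illustrates the idea and is not the team’s actual analysis tooling.

```python
# Self-diffusion coefficient from the Einstein relation: MSD ~ 6*D*t at
# long times in three dimensions. Illustrative analysis sketch only.
import numpy as np

def diffusion_coefficient(trajectory, dt):
    """trajectory: (n_steps, n_particles, 3) unwrapped positions (cm);
    dt: time between stored steps (s). Returns D in cm^2/s."""
    disp = trajectory - trajectory[0]                # displacement since t = 0
    msd = np.mean(np.sum(disp**2, axis=2), axis=1)   # average over particles
    t = dt * np.arange(len(msd))
    half = len(msd) // 2                             # fit only the late-time regime
    slope = np.polyfit(t[half:], msd[half:], 1)[0]
    return slope / 6.0
```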

The Cimarron project and its successors have deepened scientists’ theoretical understanding of dense, nonequilibrium plasmas—an effort that benefits national security and energy research. In addition, the project has been successful in other ways. Its position at the cutting edge of computational physics and computer science has attracted a talented and varied group of collaborators from many disciplines, institutions, and levels of experience, including skilled early-career scientists. Seven postdoctoral researchers involved in the project have taken staff positions at Livermore and other national laboratories, and two Lawrence Scholars pursued their thesis work as part of the Cimarron team. Notes Murillo, “What made this project unique were the people involved. Never before have I seen a group of people from such diverse scientific backgrounds work together so well.”

—Rose Hansen

Key Words: Cimarron project, ddcMD code, hydrodynamics, inertial confinement fusion (ICF), kinetic theory, Laboratory Directed Research and Development Program, molecular dynamics (MD), National Ignition Facility (NIF), plasma.

For further information contact Frank Graziani (925) 422-4803 (graziani1@llnl.gov).