LIVERMORE’S BlueGene/L, the world’s most powerful computer, was designed for big jobs. Balancing a checkbook does not require a massively parallel supercomputer, but trying to discern what happened right after the big bang most certainly does.
Quarks, the building blocks of all nuclear material, ran free for about 10-millionths of a second after the big bang. Then, as the universe began to expand and cool, quarks coalesced into protons and neutrons, held together by massless gluons. Since the big bang about 13.7 billion years ago, quarks have never been on their own, except for a few brief moments in a particle accelerator at Brookhaven National Laboratory and during the occasional cataclysmic cosmic-ray collision.
The experiments at Brookhaven's Relativistic Heavy-Ion Collider (RHIC), where gold ions were blasted apart into individual quarks, are the driver behind the computational effort to re-create the phase transition from quarks to larger particles. Some of the resulting particles are common ones, such as neutrons and protons; others are more exotic. Physicist Ron Soltz, who led Livermore's participation in the experiments at Brookhaven, says, "We succeeded in freeing the constituent quarks very briefly. However, what we observed after the experiment was not what we expected to see." Researchers thought they would find a hot, energized gas—or plasma—in which particles did not interact. Instead, they found strong particle interactions. "Our current models are only partially successful in explaining what we observed," says Soltz.
Livermore researchers working with theoretical physicists from around the world are using BlueGene/L to fill in the missing information about the phase transition. They are applying mathematics and the basic laws of physics to explain and extend results from the accelerator experiments at Brookhaven. Their calculations of the conditions surrounding the transition from quarks to larger particles are based on the theory of quantum chromodynamics (QCD). (See the box below.)
QCD was developed in the 1970s to explain how a vast array of particles could arise from only a few types of quarks, which are forever trapped inside. QCD describes the "strong force," whimsically termed color (chroma in Greek), that prevents quarks from running free except under extreme conditions. QCD has been called "the most perfect physical theory" by one of its coinventors because of its broad scope and simplicity.
In the late 1970s, famed physicist Richard Feynman designed a network for an early supercomputer and used it for QCD calculations. QCD theoretical physicists at Columbia University in New York City have long had a close working relationship with supercomputer designers at IBM’s nearby T. J. Watson Research Laboratory. QCD theorist Pavlos Vranas came to Livermore from IBM, where he was a member of the core architecture team for the Blue Gene line of supercomputers and lead designer of the Blue Gene network.
It is no coincidence, then, that a massively parallel computer is the ideal tool for simulating the interaction of quarks and other nuclear particles. BlueGene/L’s processors communicate among themselves in much the same way that quarks and gluons interact in lattice QCD, the most accurate methodology for simulating the strong force. Lattice QCD defines space–time as a four-dimensional grid of points connected with links.
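To make the geometry concrete, the following is a toy sketch (not production lattice-QCD code) of the four-dimensional grid the article describes. The lattice size is made small purely for illustration; in lattice QCD, quark fields live on the sites and gluon fields live on the links between them.

```python
# Illustrative sketch of a tiny four-dimensional periodic lattice of
# the kind used in lattice QCD. Quark fields live on the sites; gluon
# fields live on the links connecting neighboring sites.
from itertools import product

N = 2  # lattice points per dimension (real simulations use far more)

# Sites are 4-tuples (x, y, z, t).
sites = list(product(range(N), repeat=4))

def forward_links(site):
    """One 'forward' link per space-time direction, with periodic
    boundaries so the lattice wraps around on itself."""
    links = []
    for mu in range(4):  # the four space-time directions
        neighbor = list(site)
        neighbor[mu] = (neighbor[mu] + 1) % N  # wrap around the edge
        links.append((site, tuple(neighbor)))
    return links

links = [link for s in sites for link in forward_links(s)]
print(len(sites), len(links))  # N**4 sites, 4 * N**4 links
```

Counting the forward links only (one per direction per site) avoids double-counting, which is why a lattice of N⁴ sites carries exactly 4N⁴ links.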
The 2006 Gordon Bell Prize for Special Achievement was awarded to the BlueGene/L supercomputer and quantum chromodynamics project team led by Vranas and Soltz. The team demonstrated the supercomputer's performance in a simulation that proved to be an ideal match between the demands of lattice QCD calculations and the computing capabilities of BlueGene/L. A version of lattice QCD written especially for BlueGene/L showed that the simulation scales almost perfectly. Lattice QCD can run on the full machine, using all 131,072 processors, just as efficiently as it does using 1,024.
In a lattice QCD simulation, the lattice is broken into smaller pieces, or sublattices. “Each processor works on a sublattice, and the processors are connected by a network,” says Vranas. “For optimal operation, we needed to code the machine carefully.” In October 2006, their coding produced a sustained operating speed of 70.4 trillion floating-point operations per second (teraflops), an extraordinarily fast performance.
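The sublattice idea can be sketched in a few lines. The global lattice and processor-grid sizes below are invented for illustration; the point is that each processor owns a block of sites and need exchange only the thin boundary ("halo") with its neighbors, which keeps communication small relative to computation.

```python
# Illustrative sketch of domain decomposition: a global 4-D lattice is
# split into equal sublattices, one per processor. Each processor
# updates its own interior sites and exchanges only its boundary faces
# with neighboring processors. All sizes here are hypothetical.
global_dims = (32, 32, 32, 32)   # global lattice points per dimension
proc_grid   = (4, 4, 4, 2)       # hypothetical 4-D processor grid

# Each processor's sublattice dimensions.
sub_dims = tuple(g // p for g, p in zip(global_dims, proc_grid))

def volume(dims):
    v = 1
    for d in dims:
        v *= d
    return v

interior = volume(sub_dims)

# Boundary sites to exchange: two faces of the sublattice per dimension.
halo = sum(2 * volume(sub_dims[:mu] + sub_dims[mu + 1:]) for mu in range(4))

print(sub_dims, interior, halo)
```

Because the halo grows with the sublattice's surface area while the work grows with its volume, bigger sublattices spend proportionally less time communicating, which is one reason the lattice QCD code could scale so well across BlueGene/L's processors.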
Because BlueGene/L has many users, QCD researchers can routinely use only a percentage of its capacity. The typical simulation volume on BlueGene/L is equivalent to just a few proton diameters, or a few quadrillionths of a meter. However, simulations using 10 percent of the machine’s processors could reproduce essentially the full range of thermodynamic QCD behavior. “The QCD community is thrilled with the results that are possible today,” says Soltz.
Thanks to BlueGene/L, Livermore has become the epicenter of QCD simulations. Soltz, an experimental physicist, is now a part-time theoretical physicist. He and theorists Vranas and Thomas Luu are participating in several collaborations that use BlueGene/L to simulate both the QCD physics that occurred at the time of the very hot big bang and the low-energy QCD particle interactions in the cold universe of today. “BlueGene/L is the most important member of any QCD collaboration,” says Soltz.
A Lattice in Space
Small perturbations can be calculated in studying a high-energy system, such as the quark–gluon plasma that existed just after the big bang. However, challenges arise in a lower-energy system, for example, just below the temperature at which quarks coalesced into larger particles. In this latter regime, the interactions are too far from a known solution to be readily calculated as perturbations. Lattice QCD, a nonperturbative method, has greatly facilitated the calculations by simulating both high- and low-energy interactions.
QCD simulations model a box of finite space, defined by the dimensions of the lattice. The quarks live on the lattice points, and interactions occur along the grid lines between the points. When a continuum theory with continuous space–time is represented by a four-dimensional lattice, representing the quarks on the lattice is tricky. The traditional way of dealing with this so-called fermion doubling problem required a lattice with a very large number of points. However, the necessary lattice size increased the amount of computing time beyond what was feasible.
Ten years ago, Vranas performed the first numerical simulations that used a new method called domain-wall fermions, which introduced a fifth dimension to the lattice. The residual effects of resolving the doubling problem become smaller as the number of lattice points along the fifth dimension increases. Remarkably, this calculational improvement is achieved with computing cost growing only linearly with the size of the fifth dimension.
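The trade-off can be illustrated with a toy calculation. The falloff rate used here is invented for illustration only; the real rate depends on the gauge fields. The point is the shape of the trade: cost grows linearly with the fifth dimension, while the residual doubling effects shrink much faster.

```python
# Toy illustration (not a real QCD calculation) of the domain-wall
# fermion trade-off: the computing cost grows linearly with the
# fifth-dimension extent L5, while the residual effects of the
# doubling fix fall off rapidly (modeled here as a generic
# exponential with a made-up rate).
import math

def cost(volume_4d, L5):
    # Cost is linear in the fifth dimension.
    return volume_4d * L5

def residual(L5, rate=0.5):
    # Hypothetical exponential falloff; the true rate is
    # determined by the simulated gauge fields.
    return math.exp(-rate * L5)

for L5 in (4, 8, 16):
    print(L5, cost(16**4, L5), residual(L5))
```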
Hot and Cold Calculations
Simulations to date by the QCD community indicate that the transition occurred when the temperature of the universe dropped below 2 trillion degrees, or 170 million electronvolts (megaelectronvolts). A single electronvolt is a very small measure of energy, equal to 1 volt times the charge of a single electron. “We think we’re accurate to within about 20 megaelectronvolts,” says Soltz. “The HotQCD collaboration is trying to obtain greater accuracy, to within just a few megaelectronvolts.”
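The article's two figures for the transition temperature are related by the Boltzmann constant, which converts between energy and temperature (T = E / k_B). A quick check:

```python
# Converting the transition energy (170 megaelectronvolts) to a
# temperature via T = E / k_B, to check the article's figure of
# about 2 trillion degrees.
K_B = 8.617333e-5  # Boltzmann constant in electronvolts per kelvin

transition_energy_eV = 170e6  # 170 megaelectronvolts
temperature_K = transition_energy_eV / K_B

print(f"{temperature_K:.2e} K")  # roughly 2e12 K, i.e. 2 trillion degrees
```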
Initial QCD equation-of-state calculations began in February 2007 using 10 percent of BlueGene/L. The HotQCD collaboration uses two methods to calculate the equation of state. The methods agree well, and the team’s results show that errors related to a particular method are small. The team is also developing validation techniques for comparing simulations with experimental data from RHIC and from the Large Hadron Collider, the world’s largest particle accelerator scheduled to come on line this year near Geneva, Switzerland.
Luu is applying lattice QCD to better understand the interactions of subatomic particles at low energies and temperatures. He is part of the Nuclear Physics with Lattice QCD collaboration, made up of nuclear physicists from around the world. This collaboration has produced the first predictions of scattering lengths for several combinations of particles, including pion–pion and pion–kaon. Scattering lengths are an indicator of the interactions occurring between the particles. These particles have very short lifetimes and are difficult to measure experimentally before they decay.
The collaboration is now focused on calculating the structure and interactions of the lightest nuclear particles—neutrons and protons, also called nucleons. “A lot of experimental data describe two nucleons,” says Luu, “but no body of data exists for the interaction of three nucleons.” Determining the scattering of these particles is more difficult because the signal-to-noise ratio in low-energy lattice QCD calculations is low. “We need larger calculations to get good data,” says Luu. “BlueGene/L makes it possible.”
Luu and his collaborators recently completed the first fully dynamic lattice QCD determination of the scattering of a nucleon and a hyperon, a more exotic particle. “This area is virgin territory,” says Luu. “It’s a fun wave to be on.”
Bigger Is Better
The most powerful accelerator today is the Tevatron at the Department of Energy’s Fermilab in Illinois. The machine is 10 times more energetic than the RHIC accelerator, reaching energies of almost 2 trillion electronvolts, or 2 teraelectronvolts. Fermilab is searching for the Higgs boson, a particle that theory predicts but has proved elusive experimentally. Later this year, the Large Hadron Collider’s 27-kilometer circular accelerator will produce the highest energies yet, in the range of 14 teraelectronvolts. Who knows what strange and wonderful bits of matter this powerful machine will reveal?
Some speculate that the Large Hadron Collider may demonstrate a strongly interacting theory such as QCD but with a different number of colors and flavors and a host of entirely new particles. Because this new theory would be similar to QCD, lattice methods could be used for simulations using larger supercomputers, successors to BlueGene/L.
The need for more computing power is never-ending. Soltz, Vranas, and Luu pounced when a competition call went out from Computing in Science and Engineering last year to describe what could be done with a quadrillion flops, or 1 petaflops, of computing power. The team’s brief essay, “Simulating the Birth of the Universe on a Petaflop,” explained the additional parameters and greater accuracy that more computing power would allow them to include in QCD simulations. Their winning essay appeared in the November–December 2007 issue of the magazine.
“Petaflops machines are not far away,” says Vranas. “With bigger computers, we will be able to model bigger boxes of space with smaller lattice spacing. That means more accuracy. However, we will still be limited in the kinds of questions we can answer. Learning how the universe operates at the quantum level is important, and we will always need larger and more powerful computers for these explanations.”
Key Words: big bang, BlueGene/L, Gordon Bell Prize, Large Hadron Collider, lattice quantum chromodynamics (QCD), quark–gluon plasma, Relativistic Heavy-Ion Collider (RHIC), strong force.
For further information contact Ron Soltz (925) 423-2647 (email@example.com).
Lawrence Livermore National Laboratory
UCRL-TR-52000-08-1/2 | January 10, 2008