The Atmosphere around Climate Models

Supercomputers, the laws of physics, and Lawrence Livermore’s nuclear weapons research all interact to advance atmospheric science and climate modeling.

Smoke from the 2016 Soberanes Fire in Monterey County, California—the costliest wildfire up to that time—begins to block out the Milky Way. Climate change makes droughts more likely and such fires more frequent and larger in scale. (Photo by Li Liu, M.D.)

Since the 1960s, computer models have been ensuring the safe return of astronauts from orbital and lunar missions by carefully predicting complicated spacecraft trajectories. A slight miscalculation could cause a craft to zoom past the Moon or Earth and become lost in space, or approach too steeply and face an equally disastrous outcome. Bruce Hendrickson, Livermore’s associate director for Computation, points out, “In the 1960s, scientists and engineers put people on the Moon with less computing power than we carry in our pockets now, whereas today’s advanced computers allow us to study phenomena vastly more complex than orbital dynamics.” That computational power offers unprecedented insight into how the physical world works, providing details about phenomena that would be infeasible to study with physical experiments. At Lawrence Livermore, numerical models running on high-performance computers are a vital part of research in many programs, including stockpile stewardship and climate studies. In fact, Livermore’s climate models trace their origins to the Laboratory’s initial development of codes to simulate nuclear weapons. Hendrickson states, “Our primary mission is nuclear weapons design, which has required us to create unique computational capabilities. These capabilities have also been applied to other national needs, including modeling the atmosphere and the rest of the climate system.”


On July 12, 2017, a 5,801-square-kilometer piece of the Larsen C ice shelf broke away from Antarctica, as shown in the satellite image. The Energy Exascale Earth System Model (E3SM) is one of the first to simulate the movement and evolution of glaciers and ice sheets. (Image courtesy of NASA.)

Over the years, advances in scientific understanding and increased computational power have resulted in higher fidelity climate models that are more representative of the real world. These computationally intense simulations have also helped shake down and benchmark subsequent generations of the Department of Energy’s (DOE’s) supercomputers before the machines transition to classified work. “Climate simulation is an application that can consume the whole machine and put it through its paces in a very demanding way,” explains Hendrickson. “The simulations touch every part of the computer.” (See S&TR, July/August 2015, Recalling the Origins of Stockpile Stewardship; and S&TR, July/August 2015, Stockpile Stewardship at 20 Years.)


Climate models divide Earth into a grid with vertical and horizontal intervals. The smaller the intervals, the finer the grid, and the better the resolution of the model—that is, the greater detail the model can produce. (Image courtesy of the National Oceanic and Atmospheric Administration.)

First Atmospheric Animation

From its inception, the Laboratory pursued numerical approaches to solving problems using cutting-edge computer systems. “Livermore went all-in with computers,” says Glenn Fox, associate director for Physical and Life Sciences. “When the Laboratory’s doors opened, the first big procurement was a state-of-the-art computer.” The room-sized Univac-1 had 5,600 vacuum tubes and 9 kilobytes of memory and ran at a speed of 1,000 floating-point operations per second (flops). In 2018, Livermore’s Sierra supercomputer will use more than 1 million microprocessors to achieve a speed of 150 petaflops—150 trillion times faster than the Univac-1.
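
The arithmetic behind that comparison is simple. A minimal sketch in Python, using only the figures quoted above, makes the ratio explicit:

```python
# Back-of-the-envelope comparison using the figures quoted above.
univac_flops = 1.0e3      # Univac-1: about 1,000 floating-point operations per second
sierra_flops = 150.0e15   # Sierra: 150 petaflops

speedup = sierra_flops / univac_flops
print(f"Sierra is roughly {speedup:,.0f} times faster than the Univac-1")
# Prints roughly 150,000,000,000,000, that is, 150 trillion.
```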

Even in the Laboratory’s early days, researchers understood that the computational approaches used to simulate nuclear weapons could also be applied to simulate the evolution of weather and to support applications such as tracking releases of radioactive and other hazardous materials. In the late 1950s, Livermore scientist Cecil “Chuck” Leith developed one of the Laboratory’s first numerical models capable of simulating the hydrodynamic and radiative processes in a thermonuclear explosion. Recognizing fundamental similarities in the underlying equations and interested in demonstrating what could be achieved with more powerful computing, Leith turned his attention to creating more comprehensive models of weather systems.

Michael MacCracken, a now-retired climate scientist who headed Livermore’s atmospheric and geophysical sciences division from 1987 to 1993, came to the Laboratory as one of Leith’s graduate students. MacCracken says, “Using the most advanced computers available in the early 1960s, Leith developed an atmospheric model that was way ahead of its time.” Leith’s Livermore Atmospheric Model (LAM) divided the atmosphere into a three-dimensional (3D) mesh with six vertical layers and a horizontal grid with five-degree intervals in latitude and longitude. LAM was the world’s first global atmospheric circulation model that calculated temperature, winds, humidity, clouds, precipitation, the day-and-night cycle, and weather systems around the globe, all starting from first-principles equations for the conservation of mass, momentum, energy, and water vapor. Leith also created the first animation of atmospheric modeling results by colorizing photographs of a black-and-white video screen and stitching them together into a film.
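
The scale of that grid is easy to estimate from the numbers above. The following sketch is illustrative only, not LAM’s actual data structures, and it uses the common approximation of roughly 111 kilometers per degree of latitude:

```python
# Rough sizing of LAM's grid from the figures quoted above (illustrative only).
lat_step_deg = 5.0
lon_step_deg = 5.0
vertical_layers = 6

lat_bands = int(180 / lat_step_deg)    # 36 bands from pole to pole
lon_bands = int(360 / lon_step_deg)    # 72 bands around the globe
cells = lat_bands * lon_bands * vertical_layers
print(f"Total grid cells: {cells:,}")  # 15,552 cells for the whole atmosphere

km_per_degree = 111.0                  # approximate length of one degree of latitude
print(f"Horizontal spacing: about {lat_step_deg * km_per_degree:.0f} km")
```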

The Livermore Atmospheric Model (LAM), developed by Cecil “Chuck” Leith in the 1960s, was the first-ever global climate model to be animated. Shown are screen captures of runs for pressure and precipitation (left) and temperature (right).

Leith’s atmospheric work also benefited other Livermore programs. For example, his study of atmospheric turbulence led to a better understanding of how to represent turbulence and turbulent flows. MacCracken adds, “Although simulations of astrophysics, plasma physics, and nuclear weapons address different temperatures, pressures, and timescales, it’s all the same basic physics. So computational advances in one area benefit the others and vice-versa.”

Ozone and Nuclear Winter

As environmental awareness rose in the late 1960s, Laboratory programs began to address regional and global environmental problems. Derived from LAM, an early climate model developed by MacCracken was used to analyze hypotheses about the causes of ice age cycles, the effects of volcanic eruptions and changes in land cover, and the consequences of changes in atmospheric composition. The pioneering LAM would eventually lead to the global climate models that today also encompass interactive representations of the oceans, land surfaces, ice masses, and biological activity in the oceans and on land. In parallel with these efforts, atmospheric chemistry models were developed. The first such model contributed to a successful plan to limit rising concentrations of photochemical smog in the San Francisco Bay Area. The second model simulated stratospheric chemistry and was used to calculate the impact of a proposed fleet of supersonic transport aircraft on stratospheric ozone. This modeling also investigated the potential for ozone depletion from atmospheric nuclear testing in the early 1960s and the much larger depletion that would result from a global nuclear war with megaton-yield nuclear weapons.

When chlorofluorocarbon (CFC) emissions from aerosol spray cans, refrigerators, and other sources came under scrutiny in the 1970s, the stratospheric chemistry model was applied to evaluate the ozone-depletion potential of these compounds and to develop a metric for calculating depletion that was later used in the Montreal Protocol to regulate CFC emissions. After restrictions were put in place in 1987, growth of the continent-size hole in the ozone layer slowed, and after further international agreements, the hole eventually stopped growing and began to shrink slowly.


An atmospheric chemistry model jointly developed by Livermore and NASA predicts recovery of the ozone hole between 2000 and 2029. The left-hand panels show the extent of the ozone hole (blue shows minimum ozone), while the right-hand panels show the amount of ozone-destroying chlorine monoxide (peaking in red). Actual data (not shown) confirmed that once steps to reduce ozone-depleting gases were taken, the hole in the ozone layer stopped growing and began shrinking.

In the mid-1980s, famed astrophysicist Carl Sagan and others raised the specter of a “nuclear winter”—that the blasts and fires from a global nuclear war could loft enough smoke and other matter into the atmosphere to obscure sunlight for months, causing a global vegetation die-off and a winterlike cooling of the entire planet that could kill billions of people. In 1945, before he cofounded Lawrence Livermore and later became its director from 1958 to 1960, Edward Teller had made the critical determination at Los Alamos that a nuclear explosion would not ignite the atmosphere. Now he questioned the severity of a nuclear winter. Climate scientist Curt Covey, who retired from Livermore in 2017, remembers Teller saying, “At Livermore, we have the best computers. Surely we can do the best job in simulations.” In response, Livermore used its modeling capabilities to investigate the global effects of nuclear winter and found that although significant cooling would occur depending on the amount of smoke lofted, the effects would be less severe than initially conjectured.

Building on these wide-ranging activities, Livermore was well positioned to simulate and better understand the effects on the climate of increasing concentrations of carbon dioxide (CO2) and other greenhouse gases. Fox notes, “Although assessing the impact of human activities on the environment was not part of the Laboratory’s original charter, as our capabilities developed, we stepped into a broader program that has played an important role for the country.”

Chernobyl and Tracking Releases

In 1986, the worst nuclear accident in history occurred at the Soviet Union’s Chernobyl power plant in Ukraine. A partial meltdown of the reactor’s core resulted in a massive explosion and open-air fires that belched radioactive material into the atmosphere for days. Livermore’s National Atmospheric Release Advisory Center (NARAC) had been created by DOE nearly a decade earlier as an emergency response asset and was part of DOE’s 1979 response to the radioactive release at the Three Mile Island reactor in Pennsylvania. Immediately after the Chernobyl accident, NARAC worked with subject-matter experts both inside and outside the Laboratory, quickly connecting its local and regional dispersion models to global meteorological models to estimate where the plume and fallout from Chernobyl would spread. The models, validated with measurements from different countries, helped provide a better understanding of the impacts of the release and possible protective measures, including an analysis of potential threats to the milk supply.

After Chernobyl, NARAC’s responsibilities were expanded to a global scale to better safeguard the nation and the world. NARAC has since responded to many national and international incidents, most notably the 2011 nuclear power plant accident in Fukushima, Japan. Today, NARAC is expanding its high-resolution atmospheric and transport models to span spatial scales from the worldwide transport of radiological materials to dispersion down city streets from, say, a radiological dispersal device or an accident at a chemical plant. These models incorporate highly resolved terrain and meteorological information and are used to prepare for a wide range of release scenarios, including large fires or chemical spills, incidents involving weapons of mass destruction, and nuclear power plant failures. Lee Glascoe, program leader for NARAC, says, “When we are alerted to a hazardous release, we work quickly with DOE using NARAC’s atmospheric modeling capabilities to provide decision makers with predictions of hazards associated with the plume dispersal to help protect workers and the public.”
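
NARAC’s operational models are far more sophisticated, coupling detailed terrain, building, and weather data, but the textbook Gaussian plume formula conveys the basic idea behind dispersion modeling: predicted concentration falls with wind speed and with how quickly the plume spreads crosswind and vertically. The sketch below is a classroom-style illustration, not NARAC code, and its release and dispersion values are hypothetical placeholders.

```python
import math

def gaussian_plume(q, u, y, z, h, sigma_y, sigma_z):
    """Textbook Gaussian plume concentration (g/m^3) with ground reflection.

    q: release rate (g/s); u: wind speed (m/s); y: crosswind offset (m);
    z: receptor height (m); h: effective release height (m);
    sigma_y, sigma_z: crosswind and vertical plume widths (m), which in a real
    model grow with downwind distance and atmospheric stability.
    """
    lateral = math.exp(-y**2 / (2.0 * sigma_y**2))
    vertical = (math.exp(-(z - h)**2 / (2.0 * sigma_z**2)) +
                math.exp(-(z + h)**2 / (2.0 * sigma_z**2)))  # reflection off the ground
    return q / (2.0 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# Hypothetical example: a 10 g/s release in a 3 m/s wind, receptor on the plume
# centerline at ground level; all values are illustrative placeholders.
c = gaussian_plume(q=10.0, u=3.0, y=0.0, z=0.0, h=20.0, sigma_y=80.0, sigma_z=40.0)
print(f"Estimated concentration: {c:.2e} g/m^3")
```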


An atmospheric release simulation from Livermore’s National Atmospheric Release Advisory Center shows how a hazardous plume could disperse through streets and flow around buildings. This sophisticated and validated atmospheric model can resolve to the meter scale. The system is being developed to launch on short notice in an emergency to provide responders with actionable information needed to protect people from plume exposure.

Climate Model Intercomparisons

Over the decades, computational advances have allowed more components of the climate system to be combined into a single model. Previously, combining components such as the atmosphere, land surfaces, oceans, sea ice, aerosols, and the carbon cycle into one model was far too complex. In 1989, Livermore took the lead in an international program designed to evaluate and learn from the increasing number of climate models being developed by leading scientific organizations around the world. This effort, the Program for Climate Model Diagnosis and Intercomparison (PCMDI), was announced in a press release by Bruce Tarter, then associate director for Physics and later Laboratory director from 1994 to 2002. Tarter stated, “The greenhouse effect is of tremendous global concern. It is essential that the policymakers in the U.S. and internationally have the necessary tools to address it. Our new program will enable us to improve the scientific tool that is of extraordinary value in this effort—the computer model.” (See S&TR, June 2012, Seeking Clues to Climate Change.)

Climate models are grounded in the laws of physics, and their simulations of the historical climate are carefully compared to available global observations. This grounding allows researchers to assess many kinds of “what if” climate change scenarios. Tom Phillips has worked as a PCMDI climate scientist for decades and recognizes that the complexity of climate models can make them difficult to understand, which in turn has led to criticism. For instance, climate scientists have been accused of tweaking the models to produce desired outcomes. Phillips denies the claim, explaining, “We look at the whole system as manifested in different aspects, such as the variation in global temperature, the hydrological cycle, and atmospheric circulation. Even if it were possible to tweak parameters to achieve specific results, the tweak would affect other results, making it very apparent that something was amiss.” Parameters in the models are deeply embedded in the very equations describing physical processes, with values set according to the physics being represented. Modeling results—say, rainfall sensitivity to CO2—emerge only after the equations have been solved over the range of time covered by the models. Furthermore, results must be consistent with extensive observed data.

For almost three decades, PCMDI has been closely examining and comparing the results of climate models with observed changes in the climate system. If climate model results differ from observations or other models, scientists use this difference as an opportunity for learning. Ongoing testing and intercomparison can thus lead to improvement of all models. Although differences in modeling approaches lead to some variation among climate models, as does the natural variability of weather, multiple runs from multiple models usually reveal consistent trends when the results are combined into an ensemble.
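
The value of an ensemble is easy to see in a toy example: individual runs wander because of internal variability, but averaging across many runs and models lets the shared trend stand out. The sketch below uses synthetic numbers only and is not PCMDI’s analysis workflow.

```python
import random

# Minimal illustration of the ensemble idea: individual runs differ because of
# internal variability, but their shared trend emerges in the multi-model mean.
random.seed(0)
years = list(range(1980, 2020))
n_models, n_runs = 5, 3

def one_run(trend_per_year=0.02, noise=0.15):
    """One synthetic 'model run': a warming trend plus year-to-year noise."""
    return [trend_per_year * (yr - years[0]) + random.gauss(0.0, noise) for yr in years]

runs = [one_run() for _ in range(n_models * n_runs)]

# Ensemble mean for each year: average the same year across all runs.
ensemble_mean = [sum(r[i] for r in runs) / len(runs) for i in range(len(years))]
print(f"Ensemble-mean anomaly, first vs. last year: "
      f"{ensemble_mean[0]:+.2f} C vs. {ensemble_mean[-1]:+.2f} C")
```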

Climate scientist Céline Bonfils, who focuses on hydrological effects such as aridity, confirms that climate models are performing quite well, adding, “The big picture is relatively well understood in terms of the global scale. For instance, the climate models already accurately simulate winter storm tracks in mid-latitudes, monsoon systems, and arid lands in subtropics. As the world warms, wet regions are tending to become wetter and dry regions drier. What we are doing now is trying to understand the details at much finer levels, on regional scales. People care about what’s happening in their backyard. They also want to know to what extent human-induced climate change might have made Hurricanes Harvey, Irma, and Maria more destructive than previous hurricanes.”

Warming near Earth’s surface is shown by both (left) satellite observations and (right) climate models, specifically, for temperature change in the lower troposphere from January 1979 to December 2016. The average of historical simulations performed with 37 different climate models corresponds well with satellite temperature measurements made by Remote Sensing Systems.

PCMDI has been a major contributor to all five assessment reports by the Intergovernmental Panel on Climate Change (IPCC). After the fourth assessment, more than 40 Livermore researchers were recognized when the IPCC was co-awarded the 2007 Nobel Peace Prize for its efforts to “build up and disseminate greater knowledge about man-made climate change, and to lay the foundations for the measures that are needed to counteract such change.”

Models Meet Supercomputers

In the late 1980s, as the Cold War ended, Laboratory scientists and engineers looked for wider application of their expertise in nuclear weapons modeling and other simulations. Some researchers shifted their careers to climate modeling. Then, with the end of underground testing and the signing of the Comprehensive Nuclear-Test-Ban Treaty in 1996, science-based stockpile stewardship was born. Modeling weapons in great detail as they age is central to stockpile stewardship, and the need arose to push even harder on the boundaries of supercomputing, resulting in machines of incredible power. Behind these supercomputers is a small army of computer scientists who develop the computer codes used to safeguard the nuclear weapons stockpile, as well as codes employed in climate research. Computer scientist Dean Williams explains, “You cannot use the actual Earth as an experiment. You cannot double or triple the amount of greenhouse gases in the atmosphere to see what happens to the planet. That’s an impossible option. The only way we can investigate what would happen is by using computer simulations. We simulate nuclear tests. We simulate how airplanes fly. We’ve shown this works for complex systems, with close agreement between simulations and observations. If we can do these things, why not climate? The same scientific approach and fundamental physics principles apply to all.”

The principal investigator and chair of the Earth System Grid Federation (ESGF), Williams has spent almost 30 years working with climate data. ESGF is a massive data-management system that allows researchers from all over the world to securely store and share models, analyses, and results, along with observational data from satellites and other scientific sources. (See S&TR, January/February 2013, Dealing with Data Overload in the Scientific Realm.) Williams says, “As computer scientists, we interface with climate scientists. We also work with the hardware, networking, and other software application teams. When I’m talking about climate models and moving around petabytes of data, I’m dealing with ESGF. We interface with the modelers because they’re the ones running the models. We really have to understand the terminology and the science behind the models so that we can code them and give the researchers useful results.” This teamwork also helps spread knowledge about complex modeling approaches and lessons learned throughout the Laboratory’s programs.

Connecting the Dots

A straightforward way to assess the fidelity of a model is to compare the model’s results from a decade or two ago with actual observed measurements made after the model’s results were published. Williams adds, “We did some simulations in 2007, and now, in 2017, we compared the results to observed changes 10 years later. We found that the models predicted the temperature changes we are seeing now. Just plot the data points on the graph.” Another striking result concerns the human impact on climate. Scientists have demonstrated that when human influences such as increased concentrations of greenhouse gases are deliberately excluded from the models, the resulting simulations predict much colder temperatures than are observed today. In fact, no model based on careful representation of physical laws has been able to reproduce the actual observed increase in global temperatures over recent decades without including those human effects.

Each decade since the 1960s has been warmer than the previous decade. Of the 17 hottest years on record, 16 have occurred since 2001, and the years 2014, 2015, and 2016 have consecutively set record high temperatures. These records are independently confirmed by four world-leading science institutions—NASA, the National Oceanic and Atmospheric Administration, the Japanese Meteorological Agency, and the United Kingdom’s Met Office Hadley Centre. Climate models help quantitatively explain the extent of the human contribution to this warming. Readily observable, data-based cause-and-effect relationships also help to explain how human influences are driving other changes, such as melting glaciers and ice sheets, warming oceans, and rising sea levels. Accumulated climate changes are driving what the world is experiencing today—such as seawater regularly flooding streets in Miami Beach, Florida, and Newport News, Virginia, at high tide, and the poleward movement of fisheries—adding credence to projections of future trends. The next generation of climate models promises to help us be even better informed and prepared.

Models of an Exascale Kind

The DOE Office of Science launched the Energy Exascale Earth System Model (E3SM) program in 2014, but E3SM actually dates back to a 2007 Grand Challenge award at Livermore, which provided researchers a large amount of time on the Atlas supercomputer. Using Atlas, the team ran a simulation using what was then one of the most detailed coupled models of global climate ever produced. Dave Bader, who heads E3SM, says, “We match the strengths of DOE computational science with existing research. DOE has a mission to understand the consequences of energy production and use, and obviously that includes greenhouse gas emissions. This assessment requires an Earth system model, and E3SM is a DOE model, for DOE missions, running on DOE computers.”

With the addition of interactive biogeochemical processes linking the climate to plant life and other living organisms, global climate models have evolved into Earth system models. A multi-institutional program combining the efforts of six national laboratories and several other leading scientific organizations, E3SM will run simulations at resolutions of 15 kilometers (whereas Leith’s first model had a horizontal resolution of about 500 kilometers). The model will also be able to “telescope” to a resolution as small as 1 kilometer to focus attention on towns or other small locales. E3SM will also incorporate additional Earth system components such as ice shelves and glaciers that can flow and fracture—processes that are critical to projecting future rates of sea level rise.
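
A little arithmetic shows why that leap in resolution demands so much more computing. This sketch uses only the grid spacings quoted above; actual cell counts depend on each model’s mesh.

```python
# Rough comparison of horizontal resolution using the figures quoted above.
lam_spacing_km = 500.0    # Leith's 1960s model, roughly
e3sm_spacing_km = 15.0    # E3SM's standard grid

# Halving the spacing quadruples the number of columns covering the globe,
# so the column count grows with the square of the refinement factor.
column_factor = (lam_spacing_km / e3sm_spacing_km) ** 2
print(f"Roughly {column_factor:,.0f} times more horizontal grid columns")  # about 1,100x
```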


This high-resolution ocean simulation uses the Energy Exascale Earth System Model (E3SM), which divides the globe into a grid with intervals of only 15 kilometers. E3SM will run on the most advanced supercomputers and produce results of unprecedented resolution.

As part of the DOE Exascale Computing Project, supercomputer architecture is going through a radical transformation. E3SM will start running on pre-exascale supercomputers but is being designed to run on full exascale platforms. As early as December 2017, E3SM will have completed the first of its many simulations addressing important energy-related questions, such as how the availability of water resources changes over periods as short as decades, how changes in the hydrological cycle will affect energy production, and how changes in heating and cooling will affect the energy needs of infrastructure, business, and the public.

Looking Ahead

The tremendous capabilities Livermore has built in supercomputing have been applied successfully to sustain the nation’s nuclear deterrent and address other national scientific challenges. Those computational capabilities have also advanced research in related mission-critical fields. Livermore’s long, successful history of atmospheric modeling has helped identify and address a broad range of issues in the Laboratory’s mission space. Modeling has continued to improve relentlessly, leading to ever more realistic representations of the world. Today, the atmospheric release models used by NARAC span scales from local to global, even as climate models examine the world at ever finer resolution.

As with their climate model predecessors, Earth system models cannot be perfectly predictive because of the impossibility of simulating every single atom on Earth. Chaos theory also dictates that no model can predict the exact temperature and sky conditions at a given place and time of day even a day from now, let alone 10 or 100 years in the future. However, the emerging capabilities in Earth system modeling will soon provide extraordinary insights into global trends and climate statistics about Earth’s past, present, and future, allowing society to explore what has passed and better predict and prepare for what is to come.

—Dan Linehan

Key Words: atmospheric model, climate change, climate model, Earth System Grid Federation (ESGF), Energy Exascale Earth System Model (E3SM), Intergovernmental Panel on Climate Change (IPCC), Livermore Atmospheric Model (LAM), National Atmospheric Release Advisory Center (NARAC), Nobel Peace Prize, nuclear weapon, Program for Climate Model Diagnosis and Intercomparison (PCMDI), stockpile stewardship, supercomputer.

For further information contact David Bader (925) 422-4843 (bader2@llnl.gov).