
Climate Change Comes into Focus

Advanced computer models, simulations, and analysis capabilities help scientists zoom in on Earth system processes and improve climate research.

Every few months, a new prediction about Earth’s changing climate makes headlines. The most prominent organization making them, the 195-nation Intergovernmental Panel on Climate Change (IPCC), has been instrumental in establishing the terms of future climate conditions. In 2018, IPCC issued the following statement: “Climate-related risks for natural and human systems are higher for global warming of 1.5°C than at present, but lower than at 2°C (high confidence).” This 1.5°C threshold became the bellwether for climate policy around much of the world.

In the latest IPCC report, published in August 2021, the organization updated its stance, projecting surface temperature to rise over the 21st century under all assessed emission scenarios. By 2100, IPCC expects surface temperature to exceed 1.5°C above the global pre-industrial mean unless deep reductions in emissions of carbon dioxide and other greenhouse gases (GHGs) occur.

On one level, the report is a sobering look at the potential state of the planet over the next several decades. On another, it’s a testament to the innovations made by Lawrence Livermore and others to improve the scientific tools used in climate research. The Laboratory’s Program for Climate Model Diagnosis and Intercomparison (PCMDI) played an important role in producing all six IPCC reports and is continually expanding its efforts to help reduce uncertainty in climate predictions. Livermore’s high-performance computing (HPC) capabilities are key to this work, helping scientists better understand Earth system processes and develop pattern-based analysis of climate change.

A Matter of National Interest

Earth is a wonderful, big, complicated planet. Its roughly 510 million square kilometers of surface area are covered with oceans, mountains, cities, rainforests, deserts, and ice. Across the planet, temperatures, wind cycles, pressure systems, and varying amounts of sunlight, cloud cover, and precipitation, among other atmospheric conditions, create Earth’s weather. An area’s climate is defined by its prevailing weather conditions over long periods of time.


Oak Ridge National Laboratory’s Summit supercomputer was commissioned in 2018. Eight times more powerful than its predecessor Titan, Summit allows scientists to run more detailed processes and interactions within models. The Energy Exascale Earth System Model (E3SM) version 2 was validated on Summit. (Image courtesy of Oak Ridge Leadership Computing Facility.)

When predicting the weather, meteorologists typically use results of several weather models combined with their expertise to provide a forecast, usually for the week. “To create these forecasts, extensive data is collected up to the last day prior to the forecast,” says Lawrence Livermore climate scientist Steve Klein. “The data is fed into the models using varying initial conditions, with the expectation that the meteorologist will then be able to ‘predict the future’.” This process is called deterministic forecasting, and after a few days or weeks, one would know if the forecast was correct. Forecasting climate is a much trickier undertaking, with greater potential impacts on energy production, resource consumption, and human settlement patterns. In fact, the Department of Energy (DOE) recently made researching climate risks associated with the national energy system a priority.

Livermore researchers apply HPC and expertise in fundamental sciences, such as meteorology, climatology, applied mathematics, and computational science, to the problem of understanding and predicting how Earth’s systems evolve on timescales from a few years to several centuries. “We use 30-year statistical averages in climate research,” says David Bader, director of the Laboratory’s Climate Program. “However, the past 30 years are very different from the 30 years before that. We need new information that projects out 30 or 50 years to inform infrastructure design.” DOE’s Energy Exascale Earth System Model (E3SM) was born from the need to investigate climatological impacts and energy-relevant science using code optimized for the next generation of DOE’s advanced supercomputers. The first DOE exascale systems, Frontier and Aurora, are scheduled to come online in the year ahead.
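Bader’s point about 30-year statistical averages can be made concrete with a small sketch. The snippet below uses NumPy and entirely hypothetical annual temperature anomalies (not PCMDI or E3SM output) to show how successive 30-year climate “normals” can differ even when individual years are noisy:

```python
import numpy as np

# Hypothetical annual global-mean temperature anomalies (degrees C), 1931-2020.
# In practice these would come from observations or model output.
rng = np.random.default_rng(0)
years = np.arange(1931, 2021)
trend = 0.01 * (years - years[0])                       # slow warming trend
anomalies = trend + rng.normal(0.0, 0.15, years.size)   # plus year-to-year "weather" noise

# A climate "normal" is a 30-year average; successive windows can differ markedly.
def thirty_year_mean(start_year):
    mask = (years >= start_year) & (years < start_year + 30)
    return anomalies[mask].mean()

print("1931-1960 mean anomaly:", round(thirty_year_mean(1931), 3))
print("1961-1990 mean anomaly:", round(thirty_year_mean(1961), 3))
print("1991-2020 mean anomaly:", round(thirty_year_mean(1991), 3))
```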


California wildfires have raged through the state over consecutive summers. In August 2020, catastrophic wildfires razed more than a million acres of land in just a little over a week. This satellite image shows the location of several fires (orange) and the resulting smoke emissions. (Image courtesy of NASA Earth Observatory.)

Building a More Detailed Model

Earth system simulation involves solving approximations of physical, chemical, and biological governing equations on spatial grids at the finest resolutions possible. This resolution depends not only on the complexity of the model, but also on the peak performance of the supercomputer used to run the simulations. “E3SM was coded from the ground up with exascale computers in mind,” says Bader. “We saw the need for a climate model that was well-suited for DOE’s mission problems and would run well on the exascale computing architectures envisioned for the national laboratories.”
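A rough scaling argument, standard for explicit grid-based solvers though not spelled out here, shows why resolution is so tightly tied to computing power: halving the horizontal grid spacing \(\Delta x\) quadruples the number of grid columns, and the numerically stable time step shrinks along with the grid spacing, so the cost grows roughly as

\[ \text{cost} \propto \frac{1}{\Delta x^{2}} \cdot \frac{1}{\Delta t} \propto \frac{1}{\Delta x^{3}}, \]

meaning a move from 100-kilometer to 25-kilometer grid spacing demands on the order of 64 times more computation before any added model complexity is considered.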

The E3SM research team includes geophysical and computational scientists from multiple universities and DOE laboratories. In 2014, they began running the initial high-resolution E3SM model on Titan, at Oak Ridge National Laboratory’s (ORNL’s) Leadership Computing Facility, and Argonne National Laboratory’s Mira. These leading-edge computing systems of the time were capable of trillions of calculations per second and boasted a higher-performing architecture that decreased time to solution, allowed for increased complexity of the models, and provided more realistic simulations. The development effort considered the availability of even more powerful supercomputers in the coming years. “The data structures in older climate codes and the way the information was laid out across the computer would not work with these more advanced machines,” says Bader. “We did not start scientifically from scratch, but from a code perspective, we took a clean approach to building the model.” The new climate model was compared and run side-by-side with its predecessors to certify its validity. Bader adds, “We needed to ensure that we hadn’t made any errors.” The data comparison and open-source code were also made available internationally to build support and promote further validation.


E3SM version 1 debuted in 2018. It simulates the fully coupled climate system at high resolution and combines models of global atmosphere and land surface with ocean, sea ice, and river models. These components are important to DOE’s primary research thrusts related to the energy sector: Earth’s water cycle, biogeochemistry, and cryosphere (frozen water). With E3SM, researchers can simulate precipitation and surface water conditions in specific regions, examine how external and environmental factors can alter the chemical cycles of the planet, and investigate the vulnerability of the Antarctic Ice Sheet.

With further development and access to ORNL’s Summit supercomputer, commissioned in 2018, a faster, more capable E3SM version 2 was delivered and released to the broader scientific community in September 2021. Summit’s peak performance of 200,000 trillion calculations per second makes it eight times more powerful than Titan. “The increase in computing power allows us to add detail to processes and interactions that results in more accurate and useful simulations than the previous version,” says Bader. Version 2 improves the representation of precipitation and clouds. “Specifically, how clouds change in a warmer climate is much more realistic,” says Livermore atmospheric scientist Chris Golaz. In addition, E3SM version 2 offers scientists the ability to “telescope” to improve regional predictions. Golaz adds, “With E3SM version 2, we have two fully coupled configurations: a 100-kilometer globally uniform resolution and a regionally refined model resolution with 25 kilometers over North America and 100 kilometers elsewhere.”
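A toy illustration of the telescoping idea, assuming a crude latitude–longitude bounding box for North America and plain NumPy rather than E3SM’s actual mesh-generation tools, assigns a 25-kilometer target spacing inside the refinement region and 100 kilometers elsewhere:

```python
import numpy as np

# Toy 1-degree latitude/longitude cell centers.
lats = np.arange(-89.5, 90.0, 1.0)
lons = np.arange(-179.5, 180.0, 1.0)
lon_grid, lat_grid = np.meshgrid(lons, lats)

# Crude bounding box for North America (hypothetical; for illustration only).
in_north_america = (lat_grid >= 15) & (lat_grid <= 72) & \
                   (lon_grid >= -170) & (lon_grid <= -50)

# Target grid spacing in kilometers: refined inside the region, coarse elsewhere.
target_spacing_km = np.where(in_north_america, 25.0, 100.0)

refined_fraction = in_north_america.mean()   # unweighted cell count, not area-weighted
print(f"Cells flagged for 25-km refinement: {refined_fraction:.1%} of the globe")
```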




E3SM version 2 offers two fully coupled configurations that improve grid-size resolution anywhere on the globe. The model shows a 100-kilometer globally uniform resolution and a 25-kilometer grid resolution over North America.

For Klein, this feature will assist him in his work researching the behavior of clouds, which have a significant effect on the amount of sunlight that reaches the planet’s surface. He says, “It doesn’t make sense to spend computational time and energy simulating clouds on the other side of Earth while researching an especially sensitive region for energy production.” With Livermore’s climate research experience and computer-science competencies, more scientists like Bader and Klein will be able to study climate conditions, worldwide or at a near-neighborhood level. Bader says, “E3SM version 2 allows us to more realistically simulate the present, which gives us more confidence to simulate the future.”

Climate Fingerprints

Over the last few years, California wildfires have become a devastating phenomenon. In the summertime, smoke from these seasonal wildfires drastically affects air quality in the inland valley and beyond, while the infernos themselves destroy homes and level the landscape. The 2012–2016 drought in California—the driest period in its recorded history—had ended only four years before exceedingly dry conditions returned in 2020. When droughts become regular and affect the water cycle on a large scale, a region becomes truly arid.



For Livermore climate scientist Céline Bonfils, long-term drought and regional aridity are of particular interest. Since 1980, California and the western United States have become drier. In contrast, during the same period, the Sahel region in Africa—the semi-arid transitional zone between the Sahara and savannas to the south—has recovered from its long dry spell. Scientists are using a technique called climate fingerprinting to better understand these changes and their causes and effects.


In climate fingerprinting, scientists search for a pattern of climate change (a “fingerprint”) resulting from human activities as predicted by computer models. Statistical methods are then used to measure whether observations are becoming progressively more aligned with the model-based human fingerprint over time and away from noise patterns. The process attempts to separate the relative roles of natural and human influences on global climate. Shown above are satellite observations and the human fingerprint predicted for the tropospheric temperature field. (Image provided by Ben Santer.)

"The main goal of fingerprinting research is to separate the relative roles of natural variation and human influences on global climate,” says Bonfils. “Earth’s climate is noisy in nature, but it is also impacted by several external factors that act at different paces and places, similar to how different musical instruments would contribute to a song, with their own rhythms, patterns, and notes.” In fact, for the last 30 years, now-retired Laboratory scientist Ben Santer and other Livermore climate scientists have developed and applied the fingerprint method originally conceptualized by Klaus Hasselmann, a co-recipient with Syukuro Manabe of the 2021 Nobel Prize in Physics.

Bonfils led a multi-institution research team that identified two primary mechanisms, or fingerprints, affecting temperature, precipitation, and aridity levels on a continental scale. “One can think of this work like tuning a radio and capturing two different songs playing simultaneously out of the noisy background.” The first “song,” louder and clearer than the other, mainly characterized the large-scale warming and drying effect of GHGs from burning fossil fuels. The second tune, more subtle than the first, mainly represents the cooling effect of past particulate emissions from Europe and North America, leading to different rates of warming between the two hemispheres. Together, these two mechanisms help explain the changes in aridity in many parts of the world.


A composite image shows sea surface temperature anomalies indicative of a La Niña event (courtesy of the National Oceanic and Atmospheric Administration’s Coral Reef Watch) blended with a NASA Blue Marble image. (Blue Marble is a NASA-generated image of Earth created from high-resolution satellite images.)

To detect these two human-related climate fingerprints, Bonfils and colleagues examined the simultaneous changes in temperature, precipitation, and the climate moisture index. The index measures aridity based on precipitation and atmospheric evaporative demand, factors that also affect water uses such as irrigation and hydropower. The team analyzed multiple, distinct global climate models simulating the historical climate in response to recent volcanic eruptions and the evolution in GHG and aerosol emissions. From these simulations, they extracted the climate fingerprints of human activities and measured whether the observations are becoming increasingly similar to the fingerprints over time.
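The article does not spell out the exact index definition, but one common formulation of a climate moisture index combines precipitation \(P\) with potential evapotranspiration \(PET\) (a measure of atmospheric evaporative demand) so that the result ranges from \(-1\) (fully arid) to \(+1\) (fully humid):

\[ \mathrm{CMI} = \begin{cases} \dfrac{P}{PET} - 1, & P < PET \\[6pt] 1 - \dfrac{PET}{P}, & P \ge PET \end{cases} \]

Increasingly negative values of the index over time indicate that evaporative demand is outpacing precipitation, the hallmark of a drying region.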

In Bonfils’ case, she and her team used the World Climate Research Programme’s Coupled Model Intercomparison Project Phase 5 (CMIP5) framework to estimate Earth’s climate without human influence—providing an indication of what the planet would do naturally. By contrasting those simulated results with historical climate observations and simulations incorporating the human component, the researchers determined that GHGs and particulates were the most likely contributing factors. “In this study, we switched from the traditional fingerprinting technique based on linear-trend statistics to a new regression-based fingerprinting technique,” says Bonfils. “Doing so, we were able to better account for the complex temporal behaviors of forcings of the climate system, such as the slowly evolving increase in atmospheric GHGs, the complex temporal evolution of aerosol emissions, and the episodic occurrence of volcanic eruptions, and to better distinguish their individual responses.”
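The distinction Bonfils describes can be sketched with synthetic data: rather than fitting a single linear trend, the regression-based approach fits the observed time series to the distinct temporal shapes of the individual forcings. The response curves and scaling factors below are invented for illustration and are not the team’s CMIP5 results:

```python
import numpy as np

rng = np.random.default_rng(2)
years = np.arange(1900, 2021)
t = (years - years[0]) / years.size

# Idealized single-forcing response time series (assumed shapes, for illustration).
ghg = t ** 2                                                    # slowly accelerating greenhouse warming
aerosol = -np.exp(-((years - 1975) ** 2) / (2 * 20.0 ** 2))     # mid-century particulate cooling that fades
volcanic = np.zeros_like(t)
volcanic[(years >= 1991) & (years <= 1993)] = -1.0              # brief Pinatubo-like cooling pulse

# Synthetic "observations": a weighted mix of the responses plus internal variability.
obs = 1.2 * ghg + 0.8 * aerosol + 0.5 * volcanic + rng.normal(0.0, 0.05, years.size)

# Multiple linear regression of observations onto the forcing responses.
design = np.column_stack([ghg, aerosol, volcanic, np.ones_like(t)])
coefficients, *_ = np.linalg.lstsq(design, obs, rcond=None)
print("Estimated scaling factors (GHG, aerosol, volcanic):", np.round(coefficients[:3], 2))
```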

The Laboratory climate scientists also found a possible connection between a drying California and the wetter Sahel. Before 1980, particulate pollution—think Victorian-era London, England—increased from U.S. and European industries, cooling the Northern Hemisphere and pushing the tropical rain belt south toward the warmer hemisphere. This shift meant more precipitation in the American West and less in the Sahel, until regulations began to reduce particulate pollution. Without as many particulates shielding sunlight, the land-rich Northern Hemisphere began to warm faster than its ocean-dominated Southern counterpart (with help from GHGs), allowing the rain belt to move back north. As a result, this mechanism has helped wildfires continue to rage each summer in an unprecedentedly dry California, while record flooding in 2020 devastated large areas of Sahel countries such as Niger and Burkina Faso.

Hiding Behind a Volcano

Climate fingerprinting was an essential component of another research study led by Santer. With colleagues from PCMDI, the Canadian Centre for Climate Modelling and Analysis, and the Massachusetts Institute of Technology, Santer set out to determine at what point human-related fingerprint patterns became detectable within certain layers of the atmosphere.


Shown here are trajectories from a large initial-condition ensemble study looking at annual-mean atmospheric temperature in the mid- to upper troposphere. Researchers ran 50 simulations using the Canadian Earth System Model version 2 (CanESM2, light grey) and 40 simulations using the U.S. Community Earth System Model version 1 (CESM1, light brown). Trajectories and their averages (CanESM2, black line; CESM1, dark brown line) were compared to satellite data from several sources (red, blue, and green lines). Temperature changes are expressed as departures from the model and satellite annual averages from 1979 to 1981.

The team used two large ensembles, one based on the Canadian Earth System Model version 2 (CanESM2) and the other on the U.S. Community Earth System Model version 1 (CESM1). The CanESM2 large ensemble had 50 simulations, while the CESM1 ensemble included 40. Each simulation started with slightly different spatial distributions of initial weather conditions, for example, a Monday rainstorm or a Thursday wind event in a particular location. Weather, by nature, is complex and chaotic, so small differences in initial conditions—temperatures, winds, and humidity in various places—generate different future weather for the whole system, and each simulation, followed over time, represents a plausible climate trajectory. When plotted, each set of ensemble paths forms an envelope of climate trajectories. Researchers examined the evolution of the envelopes and compared them and their averages to actual data gathered from several satellite sources.
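The large initial-condition ensemble idea can be imitated with a toy climate in which the same forced warming trend is combined with chaotic internal variability that differs only in its starting state. The sketch below, a simple first-order autoregressive stand-in rather than CanESM2 or CESM1, shows how the trajectories fan out into an envelope while their average isolates the forced signal:

```python
import numpy as np

rng = np.random.default_rng(3)
n_members, n_years = 50, 60
forced_trend = 0.02                   # degrees C per year of forced warming (hypothetical)
persistence, noise_amp = 0.6, 0.15    # AR(1) internal-variability parameters (hypothetical)

trajectories = np.zeros((n_members, n_years))
for m in range(n_members):
    internal = rng.normal(0.0, noise_amp)          # slightly different initial state per member
    for year in range(n_years):
        internal = persistence * internal + rng.normal(0.0, noise_amp)
        trajectories[m, year] = forced_trend * year + internal

ensemble_mean = trajectories.mean(axis=0)          # internal variability largely averages out
envelope_low = trajectories.min(axis=0)
envelope_high = trajectories.max(axis=0)
print(f"Final-year spread across members: {envelope_high[-1] - envelope_low[-1]:.2f} degrees C")
print(f"Final-year ensemble-mean warming: {ensemble_mean[-1]:.2f} degrees C")
```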

Results from the study showed that stratospheric cooling (approximately 14 to 29 kilometers above the Earth’s surface), which is primarily due to increases in ozone-depleting substances, was first detectable between 1994 and 1996. Detecting GHG-driven warming in the troposphere (from the surface of the Earth to approximately 18 kilometers up), however, did not occur until between 1997 and 2003—thanks to a volcano. The 1991 eruption of Mt. Pinatubo in the Philippines warmed the lower stratosphere while cooling the troposphere, temporarily hiding human influences on atmospheric temperature.

The two ensembles generated by the models differed in how consistent the fingerprint detection times were when compared to the satellite data, yet the results are encouraging. “This was the first time that the large initial-condition ensemble method was used to look at the detection time for human fingerprints on the global climate,” says Santer. Although more research is needed to better understand climate response to natural variability and human-related causes, the technique is showing its worth as a tool for separating the two. “It enables us to differentiate more clearly between human-caused signals and noise, or natural internal variability in climate, and helps us better understand how and when human activities first began to affect climate.”



Variations Are Key

In early 2021, Livermore scientists untangled a long-standing mystery: most climate models simulated more warming in the tropical troposphere than was observed in satellite data. Previous research studying the differences between satellite observations and model simulations suggested that climate models are overly sensitive to GHG changes. In contrast, Lawrence Livermore researchers found that natural climate variations, such as episodic warming and cooling from El Niño and La Niña events, respectively, can largely explain the discrepancy.

Analyzing more than 400 simulations from the newest generation of climate models, the team found that 13 percent of simulations are in accord with satellite observations. Climate models with both small and large sensitivities to GHG changes had individual simulations in accord with the satellite observations. This finding suggested that additional factors must be at play.
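The comparison behind a figure like that 13 percent can be reduced to a simple trend test: compute the tropical tropospheric warming trend in every simulation and count how many fall within the observational uncertainty range. All numbers below are placeholders for illustration, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical tropical tropospheric warming trends (degrees C per decade)
# for 400 model simulations, plus an observed trend and its uncertainty range.
model_trends = rng.normal(loc=0.28, scale=0.06, size=400)
observed_trend, observed_uncertainty = 0.17, 0.05

in_accord = np.abs(model_trends - observed_trend) <= observed_uncertainty
print(f"Simulations in accord with observations: {in_accord.mean():.0%}")
```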

Natural climate variations turned out to be an important consideration. “While models are intended to represent the average climate, its forced changes, and realistic natural variations, they can only simulate the observed timing of natural climate events—and their effect on the long-term warming trend—by chance,” says Livermore scientist Stephen Po-Chedley. The research team demonstrated that the pattern of surface temperature change and the accompanying rate of tropical tropospheric temperature change is strongly modulated by natural climate variations. Model simulations with greater-than-average tropical tropospheric warming tend to have an El Niño-like pattern of surface warming. The real world, in contrast, has had a La Niña-like pattern of surface warming over the years of interest. Simulations with this pattern exhibit reduced tropical tropospheric warming and are more likely to agree with satellite observations. These results demonstrate that climate models can simulate warming of the tropical troposphere that is consistent with observations, and that natural variability has likely reduced tropospheric warming over the satellite era. “In reconciling modeled and observed warming rates, we showed that climate sensitivity is not the sole determinant of atmospheric warming,” says Po-Chedley. “Natural variability is an important piece in the puzzle.”

Looking Forward

Earth’s physical, biological, and chemical systems are complex and interconnected. Understanding and predicting how climate affects those systems, and what drives climate change in the first place, is a monumental task requiring the tenacity and expertise of climate scientists around the world. “Climate modeling is a nonlinear problem. We can’t get the large-scale right if we don’t get the small-scale right, which is why we focus on improving model physics,” says Bader.

Despite the obstacles, Lawrence Livermore, in partnership with its sister laboratories and other collaborators, continues to advance the tools and methodologies needed to accurately predict future climate conditions. With powerful computers and an ongoing refinement process, scientists build and improve sophisticated climate models such as E3SM to zoom in on elements of climate-vulnerable U.S. infrastructure. These models also enable enhanced techniques, such as climate fingerprinting, to clarify both natural and human-related influences affecting Earth’s atmosphere.

By comparing and validating research and data over more than three decades of study and using state-of-the-art tools of this era, the Laboratory is bringing into focus the evolution of climate change and its potential effects. In the 2021 IPCC report, Valérie Masson-Delmotte, the IPCC Working Group 1 co-chair, summarized the importance of this work, stating, “We now have a much clearer picture of the past, present, and future climate, which is essential for understanding where we are headed, what can be done, and how we can prepare.”

—Ben Kennedy
(with additional reporting by Ann Parker and Anne Stark)

Key Words: Canadian Earth System Model version 2 (CanESM2), climate change, climate fingerprinting, Community Earth System Model version 1 (CESM1), Energy Exascale Earth System Model (E3SM), ensemble modeling, exascale computing, greenhouse gas (GHG), Intergovernmental Panel on Climate Change (IPCC), Program for Climate Model Diagnosis and Intercomparison (PCMDI), stratosphere, troposphere.

For further information contact David Bader at (925) 422-4843 (bader2@llnl.gov) or Céline Bonfils at (925) 423-9923 (bonfils2@llnl.gov).