Understanding the Changing Climate

Computer simulation-generated image of Earth in space.
As part of the Energy Exascale Earth System Model (E3SM) project, researchers have created the Simple Cloud-Resolving E3SM Atmosphere Model (SCREAM) to improve climate predictions. SCREAM simulations, such as the one depicted here, show unprecedented detail thanks to their fine grid scale, producing visualizations that can be indistinguishable from satellite images. (Image created by Alex Kuhn and Niklas Roeber [NVIDIA] using the Omniverse and IndeX graphics packages.)

Ensuring that researchers and policymakers can predict and prepare for the long-term effects of a changing climate is the central motivation for the Energy Exascale Earth System Model (E3SM) project, a multilaboratory initiative pooling Department of Energy (DOE) talent and computing resources to forecast global climate trends. (See S&TR, September 2021, Climate Change Comes into Focus.) Climate researchers are taking advantage of DOE’s emerging exascale capabilities, constructing higher-resolution models and exploiting faster simulation run times to enhance their predictions. As the lead institution for E3SM, Lawrence Livermore provides scientific and technological expertise to develop crucial elements of the model, including the Simple Cloud-Resolving E3SM Atmosphere Model (SCREAM), a computational program that runs on exascale computing systems to make accurate long-term predictions about Earth’s evolving climate patterns.

As understanding and predicting climate change has become a national economic and security priority, insights from studying the changing climate will inform some of the most consequential strategy decisions and policymaking in modern history. “Our climate may be profoundly different in 100 years than it is today—and not just because of expected warmer temperatures,” says staff scientist Peter Caldwell, leader for Livermore’s Climate Modeling Group and for SCREAM. “Changes in precipitation patterns have great potential to produce more severe storms, floods, and droughts. Already, we are experiencing sea-level rise and increased wildfire risk. Clearly, there are national-level implications from food security to infrastructure resilience to geopolitics. Understanding these risks is essential to ensuring we’re equipped to deal with anticipated changes.”

Computer simulation image of water vapor density along the western coast of California.
In this snapshot of a landfalling atmospheric river along the west coast of North America, vertically integrated water vapor, measured in kilograms per square meter (kg/m2), is indicated in transparent grayscale, with fully opaque regions at 40 kg/m2. Colors represent precipitation intensity in millimeters (mm) per day.

While weather forecasts can be validated every day, enabling scientists to construct accurate forecasting models using empirical data, climate models cannot be validated until after climate trends are already in motion. Thus, climate models rely on known physical laws and model fidelity to simulate trends that may take decades or more to arise. “An inherent challenge of climate modeling is to ensure that the assumptions upon which our models operate will continue to hold as the climate evolves,” says Caldwell. 

Climate models do not forecast individual weather events in the short or long term, but rather predict the long-term statistics of weather, such as the average temperature or the typical frequency and intensity of precipitation events. Caldwell explains, “We don’t know exactly when an event will happen, but we do know how frequently it could occur over a given period. We’re not interested in whether it will rain or not on a particular day. Instead, we try to determine whether precipitation will start to cause flooding during a certain decade in the future.” Obtaining a clearer picture of the frequency and severity of extreme weather events allows key stakeholders from the local to federal level to take precautionary actions, for instance, by enacting energy policy changes and reinforcing civil infrastructure.
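
To make the distinction concrete, the sketch below (illustrative C++, not SCREAM code) turns a record of daily precipitation into the kind of statistic climate modelers care about: how many flood-level days occur per decade, rather than whether any particular day is wet. The threshold and sample data are assumptions for illustration.

```cpp
// An illustrative sketch (not SCREAM code) of the statistical view of climate:
// given a long record of daily precipitation totals, count how often a
// flood-level threshold is exceeded per decade, rather than asking whether
// any particular day will be wet.
#include <algorithm>
#include <iostream>
#include <vector>

int main() {
    // Hypothetical daily precipitation totals (mm/day) for one location;
    // in practice this would be decades of simulated model output.
    std::vector<double> daily_precip_mm = {2.1, 0.0, 115.3, 8.7 /* , ... */};
    const double flood_threshold_mm = 100.0;  // assumed flood-level intensity

    const auto exceedances = std::count_if(
        daily_precip_mm.begin(), daily_precip_mm.end(),
        [=](double p) { return p >= flood_threshold_mm; });

    // Convert the raw count into a climate statistic: events per decade.
    const double years = daily_precip_mm.size() / 365.25;
    std::cout << "flood-level days per decade: "
              << 10.0 * exceedances / years << "\n";
    return 0;
}
```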

Building a Better Model

A complete Earth-system model must consider ocean currents, sea ice, the growth of trees and plants, and atmospheric phenomena. SCREAM is an integral piece of the larger E3SM project: while E3SM investigates the full range of climatological impacts and energy-relevant science using code optimized for DOE’s next-generation supercomputers, SCREAM is the component responsible for modeling atmospheric processes and trends.

Side by side comparison of satellite imagery and a computer simulation of cloud cover.
SCREAM’s simulated atmospheric phenomena are remarkably similar to real-world observations. (left) An image of a cold-air outbreak off Siberia—captured by the satellite Himawari on January 22, 2020, at approximately noon local time—is compared with (right) SCREAM predictions after two simulated days of atmospheric evolution, initialized with conditions such as temperature, water vapor, and clouds at the model start time. Darker values (more negative on the scale of watts per square meter) indicate more shortwave radiation reflected back to space by clouds. Areas with values near zero can be interpreted as cloud-free or nearly cloud-free regions.

SCREAM is a high-resolution model that simulates the evolution of the atmosphere from the fundamental physical equations governing fluid dynamics; radiative transfer; particle collision, coalescence (merging of two or more cloud droplets into one droplet), and sedimentation (cloud droplets falling due to gravity); and many other atmospheric processes. By dividing the globe into a grid of tiles 3.25 kilometers (km) across, SCREAM operates at a much higher resolution than conventional climate models and can explicitly simulate storms and other atmospheric features at significantly smaller scales. SCREAM can thus address the hallmark challenge of climate and weather forecasting: atmospheric flow depends on processes spanning scales from a single snowflake all the way to globe-spanning circulations of air and water. Typical climate models divide the world into a grid of boxes roughly 100 km across to make simulations computationally affordable. As a result, many important processes are too small to resolve on the model grid. These processes must be parameterized, or estimated, based on a mixture of theory and statistical data generated by more detailed, smaller-scale models or gathered by observation. Parameterization, however, is a leading source of error in weather and climate predictions. SCREAM’s high resolution decreases reliance on parameterization and correspondingly increases trust in model predictions.
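
As a hedged illustration of what parameterization means in practice, the sketch below implements a simplified, Sundqvist-style relative-humidity estimate of cloud cover, the kind of sub-grid formula a coarse model must rely on when clouds are smaller than a grid cell. It is a textbook-style scheme, not SCREAM’s actual code, and the rh_crit parameter is assumed for illustration.

```cpp
// An illustrative sketch of "parameterization": a simplified, Sundqvist-style
// estimate of how much of a grid cell is cloudy, diagnosed from the cell-mean
// relative humidity because individual clouds are smaller than the cell.
// Not SCREAM's code; SCREAM's fine grid resolves many such features directly.
#include <cmath>

// rh is the cell-mean relative humidity (0 to 1); rh_crit is a tunable
// parameter, exactly the kind of uncertain knob parameterization introduces.
double cloud_fraction(double rh, double rh_crit = 0.8) {
    if (rh <= rh_crit) return 0.0;  // too dry for sub-grid cloud to form
    if (rh >= 1.0) return 1.0;      // saturated: treat the cell as overcast
    return 1.0 - std::sqrt((1.0 - rh) / (1.0 - rh_crit));
}
```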

In particular, SCREAM can capture the process of atmospheric convection, during which condensational heating in a cloud causes a bubble of warm air to become buoyant and rise, producing even more condensate, heat, and buoyancy until a cumulonimbus cloud is formed. Convection is critical for circulating heat and water between Earth’s surface and the upper reaches of the troposphere, but it occurs on scales smaller than a typical atmospheric model can capture, so conventional climate models must parameterize it. Parameterizing convection has proven as difficult as it is important: the process takes place at a much smaller scale than conventional model resolutions can achieve, and it is nonlinear, depending on complex interactions among temperature, humidity, pressure, terrain, and other factors. These difficulties apply even to the limited extent parameterization is still used in SCREAM.
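
The toy calculation below (illustrative physics, not SCREAM code) shows the feedback at the heart of convection: a parcel’s buoyant acceleration grows with its temperature excess over the environment, and condensational heating inside the parcel increases that excess, reinforcing ascent. All numeric values are assumed.

```cpp
// A toy parcel-buoyancy calculation illustrating the convective feedback
// described above. Illustrative physics only; values are assumed.
#include <iostream>

int main() {
    const double g = 9.81;       // gravitational acceleration, m/s^2
    const double t_env = 300.0;  // environmental temperature, K (assumed)
    double t_parcel = 302.0;     // rising parcel temperature, K (assumed)

    // Condensation releases latent heat, nudging the parcel temperature
    // upward and strengthening the buoyancy that drives further ascent.
    const double latent_heating_K = 1.0;  // assumed warming from condensation
    t_parcel += latent_heating_K;

    // Buoyant acceleration: b = g * (T_parcel - T_env) / T_env.
    const double b = g * (t_parcel - t_env) / t_env;
    std::cout << "buoyant acceleration: " << b << " m/s^2\n";
    return 0;
}
```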

Comparison of three maps showing climate resolution using three different computer models.
SCREAM’s climate modeling capabilities enable more accurate predictions of climate patterns and trends in the future. The illustrations above highlight the difference in simulation resolution between conventional E3SM simulations (left), SCREAM (middle), and Parameter-elevation Regressions on Independent Slopes Model (PRISM) observations (right) from Oregon State University’s PRISM Climate Group for annual average temperature 2 meters above the ground. SCREAM’s higher resolution enables it to provide more accurate information on regional climate features.

In addition to improving accuracy, increased resolution is important for capturing local effects. “San Francisco Bay Area conditions illustrate the usefulness of increased resolution. Consider a day simultaneously sunny in Livermore, rainy in Oakland, and foggy in San Francisco. In a conventional climate model, all three of these climate regimes would be averaged together within a single grid cell,” says Caldwell. Climate change will be felt differently in each of these microclimates, so high-resolution models such as SCREAM are needed for climate adaptation planning.
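
A toy example makes the averaging problem concrete. In the sketch below (hypothetical numbers, not model output), three distinct microclimates that would share one coarse grid cell collapse into a single cell-mean value representative of none of them.

```cpp
// A toy illustration (hypothetical numbers, not model output) of the
// averaging problem: three Bay Area microclimates sharing a single coarse
// grid cell collapse into one cell-mean value matching none of them.
#include <iostream>

int main() {
    const double livermore_mm = 0.0;      // sunny
    const double oakland_mm = 12.0;       // rainy
    const double san_francisco_mm = 0.5;  // foggy drizzle

    const double cell_mean =
        (livermore_mm + oakland_mm + san_francisco_mm) / 3.0;
    std::cout << "single-cell precipitation: " << cell_mean
              << " mm/day, representative of none of the three locations\n";
    return 0;
}
```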

SCREAM’s higher resolution means the model is much more computationally expensive than a typical climate model. Each doubling of horizontal resolution (for example, refining a grid from 5 km to 2.5 km) increases the number of grid cells fourfold and requires halving the model timestep, an eightfold increase in computational cost. About five such doublings separate a conventional 100-km grid from SCREAM’s 3.25-km grid, so in all, SCREAM requires approximately 32,000 times more calculations than a conventional climate model. DOE’s new exascale supercomputers—Frontier at Oak Ridge National Laboratory, Aurora at Argonne National Laboratory, and El Capitan at Lawrence Livermore—can perform more than 1 quintillion floating-point operations per second and can execute many of SCREAM’s complex calculations simultaneously.
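
The arithmetic can be checked in a few lines of code. The sketch below computes the cost ratio implied by this cubic scaling, using the grid spacings quoted in the text.

```cpp
// A quick check of the cost scaling described above: cell count grows with
// the square of the refinement ratio and the number of timesteps grows
// linearly with it, so total cost grows roughly with the cube.
#include <iostream>

int main() {
    const double coarse_km = 100.0;  // conventional climate-model grid spacing
    const double fine_km = 3.25;     // SCREAM grid spacing
    const double ratio = coarse_km / fine_km;  // ~30.8x finer in each direction

    const double cell_factor = ratio * ratio;  // ~947x more grid cells
    const double step_factor = ratio;          // ~31x more (shorter) timesteps

    // Prints ~29,000, consistent with the ~32,000x (8^5) quoted above.
    std::cout << "total cost factor: " << cell_factor * step_factor << "\n";
    return 0;
}
```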

Effectively using the graphics processing units (GPUs) that power these new computers requires rewriting old codes. Weather and climate models are the product of decades of continual development and consist of millions of lines of Fortran code. As a result, few models can take advantage of these new supercomputers without significant alteration. In addition, each GPU manufacturer supports a different programming interface, and DOE procures machines from all three major GPU vendors: Advanced Micro Devices, Inc. (AMD), Intel, and NVIDIA. To use these diverse exascale resources to their fullest, modern high-performance computing codes incorporate performance portability—the capability to transform generic instructions into the unique forms best suited for each machine. SCREAM uses the Kokkos performance portability library created by Sandia National Laboratories. Portability libraries such as Kokkos rely on the flexibility of the C++ programming language. Using C++ allows SCREAM and other simulation codes to be written once and then adapted to different machines rather than recoded for each system.
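
To give a flavor of what performance portability looks like, the minimal Kokkos sketch below writes one parallel loop in standard C++; the same source compiles for CUDA, HIP, SYCL, or CPU backends depending on how Kokkos is built. It is a bare-bones illustration, not a SCREAM kernel, and the array name and value are assumptions.

```cpp
// A minimal sketch of performance portability with Kokkos (illustrative,
// not a SCREAM kernel): the loop below is written once in standard C++,
// and the Kokkos backend chosen at build time maps it onto CUDA, HIP,
// SYCL, or CPU threads without source changes.
#include <Kokkos_Core.hpp>

int main(int argc, char* argv[]) {
    Kokkos::initialize(argc, argv);
    {
        const int n = 1000000;

        // A View is a portable array whose memory lives wherever the chosen
        // backend executes (GPU device memory, host memory, and so on).
        Kokkos::View<double*> temperature("temperature", n);

        // One generic parallel loop, dispatched to the active backend.
        Kokkos::parallel_for("init_temperature", n,
                             KOKKOS_LAMBDA(const int i) {
            temperature(i) = 288.0;  // kelvin; an assumed placeholder value
        });
        Kokkos::fence();  // wait for the device to finish
    }
    Kokkos::finalize();
    return 0;
}
```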

Future Climate Outcomes

Over the course of five years, a multilaboratory team of scientists led by Caldwell wrote SCREAM from scratch for the express purpose of running efficiently on exascale computers. The team is now working on an unprecedented 20-year simulation at 3.25-km resolution. By increasing the precision of forecasted climate effects, climate models can give decisionmakers at all levels of government more insightful and actionable data to inform climate change mitigation plans and preparations for expected effects. Predicted climate trends, such as increasingly frequent and intense tornadoes, hurricanes, floods, and wildfires, demand significant and costly recovery efforts, threaten lives, and have growing impacts on the economy from the local to the national level. Predictive tools such as SCREAM offer unique opportunities to more accurately simulate complex climate patterns, better predict long-lasting impacts, and identify the best courses of action for mitigation and adaptation.

SCREAM team members, including those at partnering national laboratories and universities, won the 2023 inaugural Association for Computing Machinery Gordon Bell Prize for Climate Modeling in recognition of achieving 1.25 simulated years per day using the entirety of the Frontier exascale system. Says Caldwell, “SCREAM and the entire E3SM project are possible thanks to the huge number of talented staff members who put their all into the effort over the past several years.”

—Elliot Jaffe

For further information contact Peter Caldwell (925) 422-4197 (caldwell19 [at] llnl.gov).