DOZENS of climate models are in use throughout the world, and their predictions for global warming as a result of a doubling of atmospheric carbon dioxide vary from about 1 to 5°C over the next 30 years. A globally averaged increase of 1°C might not matter in many parts of the world, but the larger increase could mean vast changes in snowpack, rainfall, water availability, crop production, and ocean levels, affecting billions of people.
Predictions and accompanying margins of error are used constantly, for example, to foresee where the economy is headed, determine how much will be needed in the Social Security fund for aging boomers, estimate future oil production from a particular well, anticipate the efficacy of a new drug, or determine the chance of a terrorist attack in a U.S. city. Occasionally, the magnitude of the uncertainty can rival or even exceed the value of the prediction.
How to reduce uncertainty can be unclear, in part, because it can take many forms. For example, uncertainty may exist in regard to the assumptions and inputs to a model, the errors associated with experimental data, or the approximations inherent in the physics, numerical algorithms, and mathematics of the model itself. Furthermore, a prediction may include uncertainties from many factors that may be interrelated.
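Uncertainty in a model's inputs is commonly propagated to its outputs by sampling. The following minimal Python sketch shows the idea; the toy model, parameter names, and ranges are invented for illustration and are not taken from any climate code.

```python
import random
import statistics

def toy_model(sensitivity, forcing):
    # Hypothetical model: warming scales with climate sensitivity and
    # radiative forcing, normalized by ~3.7 W/m^2 for a CO2 doubling.
    return sensitivity * forcing / 3.7

random.seed(0)
samples = []
for _ in range(10_000):
    s = random.uniform(1.5, 4.5)  # assumed range for sensitivity (degC)
    f = random.gauss(3.7, 0.4)    # assumed forcing uncertainty (W/m^2)
    samples.append(toy_model(s, f))

mean = statistics.fmean(samples)
spread = statistics.stdev(samples)
print(f"predicted warming: {mean:.2f} +/- {spread:.2f} degC")
```

The spread of the output ensemble is the propagated uncertainty; when it rivals the prediction itself, as the article notes can happen, the prediction carries little information.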
“At Livermore, significant advances in uncertainty quantification have been made in the weapons program,” says Richard Klein, a theoretical astrophysicist in the Laboratory’s weapons program and a professor of astronomy at the University of California at Berkeley. Several years ago, Lawrence Livermore and Los Alamos national laboratories worked together to develop an improved methodology for assessing the performance of nuclear weapon systems without nuclear testing. (See S&TR, March 2004, A Better Method for Certifying the Nuclear Stockpile.) Known as quantification of margins and uncertainties, the work entailed systematically combining the latest data from computer simulations, past nuclear tests, nonnuclear experiments, and theoretical studies to quantify confidence factors for the key potential failure modes in each weapon system in the stockpile.
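The confidence factor at the heart of quantification of margins and uncertainties is usually expressed as the ratio of a performance margin to the total uncertainty in that margin. A minimal sketch of the arithmetic, with invented numbers for illustration:

```python
def confidence_ratio(margin, uncertainty):
    """QMU confidence factor K = M / U.

    margin:       distance between expected performance and the
                  failure threshold (M).
    uncertainty:  total quantified uncertainty in that margin (U).
    K > 1 means the margin exceeds the known uncertainty.
    """
    if uncertainty <= 0:
        raise ValueError("uncertainty must be positive")
    return margin / uncertainty

# Illustrative numbers only -- not from any real assessment.
k = confidence_ratio(margin=2.0, uncertainty=0.8)
print(f"K = {k:.2f}")  # K = 2.50
```

The methodology's work, of course, lies in quantifying M and U for each failure mode from simulations, past test data, and experiments; the ratio itself is the simple part.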
Recognizing the applicability of this work to a wide range of scientific fields, a large collaboration at Livermore began studying uncertainty quantification (UQ) and error analysis. Klein leads this three-year Laboratory Directed Research and Development Strategic Initiative involving more than 20 scientists from four organizations: Weapons and Complex Integration, Physical and Life Sciences, Computation, and Engineering. “The experts in software, mathematics, statistics, and physics from these organizations create highly complex models and routinely deal with uncertainty,” says Klein. “With this research, our goal is to get them speaking the same language and advancing the science of UQ.”
Organizations from around the world have also been searching for ways to identify sources of uncertainty to improve the predictive capability of models. The Livermore project, which began in October 2009, brings to the table not only the Laboratory’s unique combination of expertise but also some of the largest, most powerful computers in the world.
First on the Agenda
The primary U.S. climate model, known as the Community Climate System Model, is managed by the National Center for Atmospheric Research in Boulder, Colorado. The model simulates Earth’s past, present, and future global climate. The Laboratory has a long history of involvement in atmospheric research through its National Atmospheric Release Advisory Center and works primarily on the Community Atmosphere Model, one component of the larger model.
Atmospheric scientist Curt Covey, who has been involved in climate modeling for more than 20 years, notes that uncertainties have always been addressed in climate models. “However, applying UQ at the same level of rigor as it is being used in the weapons program is new.” Working with Covey are Don Lucas, John Tannahill, and Yuying Zhang of the Laboratory’s Atmospheric, Earth, and Energy Division.
The Curse of Dimensionality
Input parameters and their associated uncertainties are known to statisticians as dimensions, and the more dimensions, the less merry the statistician’s task. Two dimensions are easy enough to solve, as are three. Beyond that, the difficulty of accommodating all the different uncertainties grows exponentially, outstripping the capacity of the most powerful computers—a problem known as “the curse of dimensionality.”
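The exponential growth is easy to see: sampling each dimension on even a modest grid multiplies the required number of model runs. A short Python illustration (the resolution of 10 points per dimension is an arbitrary choice):

```python
points_per_dim = 10  # arbitrary sampling resolution

# Model runs needed for a full tensor-product grid in d dimensions.
runs = {d: points_per_dim ** d for d in (2, 3, 6, 21)}
for d, n in runs.items():
    print(f"{d:2d} dimensions -> {n:.3e} model runs")
```

At 21 dimensions, a full grid would require 10^21 runs, far beyond any computer; this is why the project pursues methods that avoid exhaustive sampling.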
Charles Tong, a mathematician on the project, likens the curse of dimensionality to the old story of a blind man trying to identify an elephant by touching it part by part. Feeling the eyelashes leads to one conclusion about what the animal looks like, while feeling the trunk yields a different conclusion. And so it is with climate models. Upwards of 100 parameters can influence simulation predictions in climate models, and each has associated uncertainties, leading to very different results.
Uncertain climate model parameters include the humidity at which clouds form, the size of liquid droplets that make up clouds, the size at which droplets convert to rain, and many more. Covey’s team narrowed 100 or so climate parameters to the 21 most important for initial UQ studies. With a 21-dimensional hypercube, more than 2 million corners exist, and traditional Monte Carlo calculation methodologies (the shotgun approach) examine only a minuscule fraction of the total volume. Johannesson and Tong, together with physicists Bryan Johnson and Scott Brandon and mathematician Carol Woodward, are working to develop tools that reduce dimensional requirements. For example, they are identifying the parameters to which simulation results are most sensitive, so that less influential parameters can be held fixed. Mathematician Timo Bremer is working to develop a new topological method for expressing dimensionality.
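A 21-dimensional hypercube has 2^21 = 2,097,152 corners, and the simplest way to screen parameters for importance is to perturb them one at a time and rank their effects. The sketch below shows both points on a toy linear model; the model and its weights are invented for illustration and bear no relation to the actual climate parameters.

```python
import random

dims = 21
print(f"corners of a {dims}-dimensional hypercube: {2 ** dims:,}")

# One-at-a-time sensitivity screening on a toy model: perturb each
# parameter from a baseline and rank parameters by the resulting
# change in the output.
random.seed(1)
weights = [random.random() for _ in range(dims)]  # hypothetical model

def toy_model(params):
    return sum(w * p for w, p in zip(weights, params))

baseline = [0.5] * dims
base_out = toy_model(baseline)

effects = []
for i in range(dims):
    perturbed = list(baseline)
    perturbed[i] += 0.1
    effects.append((abs(toy_model(perturbed) - base_out), i))

effects.sort(reverse=True)
print("most influential parameters:", [i for _, i in effects[:3]])
```

Real screening methods (Morris designs, variance-based indices) are more sophisticated, but the goal is the same: fix the unimportant dimensions so the remaining space can be sampled affordably.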
A Predictive Pipeline
According to computer scientist David Domyancic, the predictive UQ pipeline under development will save expensive computer time and ultimately will be an automated decision-making tool. Klein says, “The pipeline will advance the process of integrating theory, simulation, and experiment—a major leap forward in UQ technology.”
Many of the same methodologies used successfully for stockpile stewardship are being applied to climate modeling as well as to target design for inertial confinement fusion experiments at the Laboratory’s National Ignition Facility. As UQ expands into other fields under Livermore’s direction, Klein hopes to establish a UQ institute at the Laboratory. “I see the work we are doing now as the first brick in the institute.”
Key Words: climate modeling, predictive pipeline, uncertainty quantification (UQ).
For further information contact Richard Klein (925) 422-3548 (email@example.com).
Lawrence Livermore National Laboratory
UCRL-TR-52000-10-7/8 | July 14, 2010