Strengthening the Power Grid to Weather the Elements


numerous electrical power poles connected by wires, within a green landscape

Severe winter storms fed by a polar vortex swept across Texas in February 2021, plunging temperatures below freezing for days on end in regions such as Austin and Houston better known for extreme heat. Residents heating their homes to combat the frigid weather placed an unmanageable demand for power on an electrical grid that was not designed to withstand such extreme loads or weather conditions. The system operator executed controlled, rolling blackouts, narrowly preventing a total, statewide grid collapse. As a result of the strained infrastructure, millions of residents and businesses were left without power. Electric and natural gas heaters failed, the water supply faltered, water treatment backed up, and food spoiled, leading to billions of dollars in damages. More than 200 deaths were attributed to the effects of the weather event.

Inclement weather is the leading cause of power outages in the United States, and the consequences of power loss—as shown by the February 2021 event in Texas—can be severe. In the interest of bolstering national energy security, a Lawrence Livermore research team led by Philip Cameron-Smith and Jean-Paul Watson completed a Laboratory Directed Research and Development (LDRD) Strategic Initiative (SI) aimed at strengthening the nation’s electrical grid. “We need investments to ensure that the U.S. power infrastructure remains reliable and resilient even as future energy needs and sources, as well as the threat landscape with respect to energy security, continue to evolve,” says Cameron-Smith.   

SI projects, focused on long-term solutions to mission-critical challenges, are the largest in scope within the Laboratory’s LDRD Program. As part of this project, the team developed data-integration methods and computational models necessary to find the most cost-effective and resilient routes to expand the power grid, using a realistic version of California’s grid as a test case. California’s diversity of energy resources and geophysical features, differences in weather patterns, and changing energy needs highlight the importance of modeling resource availability and demand variability. With additional research talent from university partners—San José State University, the University of California (UC) at Davis, and UC Berkeley—the multidisciplinary and multidirectorate endeavor united Lawrence Livermore’s deep expertise in energy security, machine learning, high-performance computing, and Earth system models, including wildfire and hydrological modeling. 

Optimizing Investments

Locations of power generators indicated with dots (top map), locations of transmission lines indicated with lines (middle map), and locations of power storage indicated with dots (bottom map).
The power grid capacity expansion and optimization model yielded a set of infrastructure investments to maximize the grid’s reliability while minimizing cost. Depicted are the model’s determinations of new California generators (top), transmission lines (middle), and storage options (bottom), optimized for a 2045 scenario. In the top map, colored dots represent different fuel sources, such as natural gas or geothermal energy, for power generation. In the middle map, purple lines represent new transmission lines. In the bottom map, colored dots represent different energy storage options, such as batteries and hydroelectric pumped storage.

Strengthening the power grid entails increasing both its reliability—the ability to regularly meet demand—and its resilience—the ability to withstand extreme threats in a cost-effective manner. Simultaneously expanding and hardening the power grid necessitates strategic investments in each class of electrical infrastructure: generation, transmission, and storage. Given the breadth of possible investment avenues for each infrastructure class, governments and private utilities alike need a way to judiciously determine the most effective capacity expansion projects, while also weighing economic and legal considerations of each option. While many such projects might seem far off in the future, investment decisions are already having major effects on the power grid. For instance, Diablo Canyon, California's only operational nuclear power plant, recently received extensive state financial support to postpone its scheduled shutdown to 2030, allowing it to continue providing electricity for approximately 3 million Californians and to bolster grid reliability.

The SI project team focused on two tasks: identifying and integrating diverse data sources on weather and the electrical grid, and developing a computational model, optimized using Pyomo (a software package of which Watson is a lead developer), that identifies the most cost-effective set of power grid infrastructure investments meeting California state requirements and future power demands across a range of possible weather conditions. A grid that is cost efficient 99 percent of the time but plunges users into darkness on the evenings of long, hot, windless summer days is not acceptable. "I encourage people to think of this work as developing an exploratory tool. The optimized investment plan depends on what assumptions are provided to it," says Tomas Valencia Zuluaga, a postdoctoral researcher in the Laboratory's Center for Applied Scientific Computing. Ultimately, the model can be adapted to any region's unique needs by adjusting data related to local patterns of energy consumption, weather risks, transmission lines, and energy technologies (available and emerging).
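
For a flavor of how such problems are posed in Pyomo, the toy model below chooses generation capacity to serve a single hour of demand at minimum capital-plus-operating cost. Every name and number is hypothetical; the team's actual model spans thousands of buses, lines, storage sites, and time periods.

```python
# Toy capacity-expansion problem in Pyomo (illustrative only; all names and
# numbers are hypothetical, not drawn from the Laboratory's model).
import pyomo.environ as pyo

model = pyo.ConcreteModel()

# Candidate generators: (build cost $/MW, operating cost $/MWh, max capacity MW)
GENS = {"solar": (900.0, 0.0, 500.0), "gas": (700.0, 45.0, 300.0)}
DEMAND = 400.0  # demand to serve (MW) in a single representative hour

model.G = pyo.Set(initialize=GENS.keys())
model.build = pyo.Var(model.G, domain=pyo.NonNegativeReals)     # capacity built (MW)
model.dispatch = pyo.Var(model.G, domain=pyo.NonNegativeReals)  # power produced (MW)

# Objective: total capital plus operating cost
model.cost = pyo.Objective(
    expr=sum(GENS[g][0] * model.build[g] + GENS[g][1] * model.dispatch[g]
             for g in model.G),
    sense=pyo.minimize,
)

# Cannot dispatch more than what is built; builds are capped per technology
model.cap = pyo.Constraint(model.G, rule=lambda m, g: m.dispatch[g] <= m.build[g])
model.lim = pyo.Constraint(model.G, rule=lambda m, g: m.build[g] <= GENS[g][2])

# Demand must be met
model.balance = pyo.Constraint(expr=sum(model.dispatch[g] for g in model.G) >= DEMAND)

# pyo.SolverFactory("glpk").solve(model)  # requires an installed LP/MIP solver
```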

The model’s proposed investment plans draw from a portfolio of options for power generation, transmission, and storage infrastructure. “On the generation side, we consider solar power and wind facilities, and in certain areas, we could also place natural gas units, biomass, hydropower, geothermal, and other facilities,” explains Watson. The model can even accommodate not-yet-realized technologies such as fusion energy plants. In the nearer future, however, users can investigate the wide deployment of small modular reactors, which are miniaturized nuclear fission reactors whose small footprint and affordability are attractive to technology companies looking to satisfy the demands of new data centers. 

Generating energy is not helpful, however, if it cannot be either immediately supplied to consumers or stored for future use. Therefore, the team next looked downstream from power generation to transmission lines and substations, where high-voltage electricity is converted to lower voltage for safe distribution to consumers. Watson explains that, due to permitting and right-of-way considerations, the most feasible transmission upgrades involve either building additional lines within corridors already in use or modifying existing transmission lines to make them more efficient and operate at higher capacity. Finally, the team considered energy storage options, including large-scale lithium-ion batteries. In general, battery systems stand to become even more efficient and store greater amounts of power given ongoing materials science developments. Other viable storage system alternatives include pumped storage hydropower, which leverages the potential energy of water pumped to a higher elevation.

High-Power Computing Approach

The researchers’ approach to narrowing down grid configurations involves two coupled optimization stages. In the first stage, they determine a set of infrastructure investments that meets projected electricity demand at minimum cost. Then, in the second stage, they subject this grid model configuration to different weather scenarios and assess how well the proposed grid fares in demanding conditions. 

Optimization is much easier said than done, however. “In our California test case, the grid contains roughly 9,000 buses (nodes where multiple circuits converge) and 11,000 transmission lines, offering many possible decisions from all the combinations of infrastructure locations and their operations,” says Elizabeth Glista, an operations research engineer in the Laboratory’s Computational Engineering Division. Valencia explains how these many elements complicate the task of optimization. “Our representation of the power grid uses a combination of binary, integer, and continuous variables,” he says. “For example, we model transmission lines as binary variables—they are either built or not built. The number of power production units built and their capacities, however, are integer values. Certain types of energy generation cannot be fractional. We can’t build half of a combustion turbine, but we can scale a solar farm by almost any amount.” 
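
A minimal sketch of how these three variable types look in Pyomo, with hypothetical names and bounds rather than the team's formulation:

```python
# Mixed variable types for grid expansion decisions (illustrative sketch).
import pyomo.environ as pyo

m = pyo.ConcreteModel()
m.build_line = pyo.Var(domain=pyo.Binary)                # line is built, or not
m.n_turbines = pyo.Var(domain=pyo.NonNegativeIntegers)   # whole combustion turbines only
m.solar_mw = pyo.Var(domain=pyo.NonNegativeReals, bounds=(0, 250))  # continuously scalable

# Coupling example: power can flow on a corridor only if its line is built
LINE_CAPACITY_MW = 400.0  # hypothetical rating
m.flow_mw = pyo.Var(domain=pyo.NonNegativeReals)
m.flow_limit = pyo.Constraint(expr=m.flow_mw <= LINE_CAPACITY_MW * m.build_line)
```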

The presence of binary and integer variables coupled with complex constraints makes their optimization task an NP-hard computing problem. (NP-hard problems are a class of computing problems whose exact solutions are extraordinarily difficult to obtain in practice unless the problem is simplified, or "decomposed," to some degree.) "We spent a great deal of time considering how to decompose the associated problems and utilize our computational resources in clever ways to obtain results in a reasonable timeframe," says Valencia. The team considered a range of optimization scenarios that differ in how patterns of energy generation and consumption are projected to evolve over coming decades. Then, rather than simply determining investments for one demand scenario at a time, team members applied a stochastic programming methodology to simultaneously minimize investment cost across this range of future scenarios. This approach to computational problem-solving accounts for the intrinsic uncertainty that comes with forecasting future conditions—uncertainty introduced by possible technology improvements or policy changes, for example—and it encourages the grid solutions to feature realistically achievable modifications.
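
Stochastic programs of this kind are often written in a generic two-stage form; the statement below is a textbook sketch, not necessarily the team's exact formulation. A single investment plan x is chosen up front, while operations y_s adapt to each scenario s, weighted by its probability p_s:

```latex
\min_{x \in X}\; c^{\top}x \;+\; \sum_{s} p_s\, Q(x,\xi_s),
\qquad
Q(x,\xi_s) \;=\; \min_{y_s \in Y(x,\xi_s)} q^{\top}y_s
```

Here X encodes the investment constraints (including the binary and integer build decisions), and Y(x, ξ_s) encodes grid operating constraints under weather realization ξ_s.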

A California map indicates the geographical area represented by each of the three utilities.
Pacific Gas and Electric serves most of central and northern California; Southern California Edison serves most of southern California, except for the San Diego metropolitan area, which is served by San Diego Gas and Electric.

Heating and cooling buildings represent a significant portion of the nation's electricity demand. Therefore, understanding the effects of weather on energy demand is necessary to assess the reliability and resilience of future power grid configurations. When severe weather and extreme temperatures occur, providing power to people becomes all the more critical. Moreover, weather also impacts the ability of many types of power plants to provide electricity in the first place—and the ability of transmission lines to transmit power to users—making a case for new power generation and delivery infrastructure, too. Obtaining accurate, high-resolution simulations of possible future weather scenarios is, therefore, vital for making informed decisions about whether, where, and how much to build, modify, or expand energy infrastructure, as well as what type of infrastructure to build. These decisions are especially pressing as the United States is poised for a rapid increase in demand due to increased manufacturing, additional computing resources for artificial intelligence (AI), and expanding electricity service to new areas. High-resolution Earth system simulations—about 3-kilometer (km) grid spacing—are required to resolve features such as the mountain passes that funnel wind to wind turbines, and output is needed hourly to test whether power generation can meet demand throughout each day.

To enable the optimization model to incorporate the link between weather and its induced patterns on electricity demand, postdoctoral researcher Minda Monteagudo developed an electrical load model that enables the team to input any temperature time series—past, present, or possible future—and predict the associated electricity demand with hourly temporal resolution. Trained by regressing utility-reported demand on historical weather data, the load model predicts time- and temperature-dependent power demand throughout California while accounting for calendar effects such as time of day, season, and day of the week.
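
A minimal sketch of such a load model, assuming piecewise-linear temperature terms and one-hot calendar features (the team's actual regression formulation may differ):

```python
# Temperature- and calendar-aware hourly load regression (illustrative sketch;
# feature choices and thresholds here are assumptions, not the team's model).
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression

def make_features(temps: pd.Series) -> pd.DataFrame:
    """temps: hourly temperature (F) indexed by a DatetimeIndex."""
    X = pd.DataFrame(index=temps.index)
    # Piecewise terms capture the nonlinear rise in cooling and heating demand
    X["cooling"] = np.maximum(temps - 80.0, 0.0)   # above ~80 F, AC load climbs
    X["heating"] = np.maximum(55.0 - temps, 0.0)   # below ~55 F, heating load climbs
    # Calendar effects: weekend flag, hour of day, and month
    X["weekend"] = (temps.index.dayofweek >= 5).astype(float)
    for h in range(1, 24):
        X[f"hr_{h}"] = (temps.index.hour == h).astype(float)
    for mo in range(2, 13):
        X[f"mo_{mo}"] = (temps.index.month == mo).astype(float)
    return X

# Usage: fit on historical utility data, then feed any temperature time series
# (past, present, or simulated future) to predict hourly demand.
# reg = LinearRegression().fit(make_features(hist_temps), hist_demand_mwh)
# forecast = reg.predict(make_features(future_temps))
```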

Plots of energy demand for Pacific Gas and Electric (top plot), Southern California Edison (middle plot), and San Diego Gas and Electric (bottom plot) indicate that ranges of energy demand vary across the state, yet remain steady for each utility until temperatures reach 80 degrees Fahrenheit and above during summer and late fall.
Weather conditions influence demand for energy. Plots reveal the nonlinear relationship between temperature (in Fahrenheit) and electricity demand (in megawatt-hours) throughout a calendar year across three California utility regions. Demand values are sourced from utility provider reports from July 2018 to December 2022.

Equipped with an understanding of demand as a function of temperature and time, the Lawrence Livermore team can project future demand using representative days of weather—archetypes of daily weather conditions from scenarios simulated by Earth system models that mathematically integrate the planet’s physical, chemical, and biological processes. Each representative day has distinct influences on the expected patterns of energy production and consumption. “For example, on a specific simulated day in 2045—what we refer to as the ‘target horizon’—a location may experience more wind and less sun than usual. In this scenario, if only solar installations exist nearby, the grid would draw on electricity stored in batteries or other systems to satisfy demand not immediately met by solar power,” says Valencia. “The driving question is, how many representative days must we account for to capture sufficient weather variability at the target horizon?” Using fewer representative days simplifies calculations for the capacity expansion problem, but at the cost of limiting how much weather variability can be captured.
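
One common way to choose representative days, sketched below, is to cluster daily weather profiles and weight each representative by its cluster's frequency; the article does not specify the team's exact selection method, so treat this as illustrative.

```python
# Select k representative days from hourly weather data via k-means clustering
# (one standard approach; features and k are modeling choices).
import numpy as np
from sklearn.cluster import KMeans

def representative_days(hourly, n_days, k, seed=0):
    """hourly: array of shape (n_days * 24, n_features), e.g. temperature,
    wind speed, and solar irradiance. Returns day indices and weights."""
    profiles = hourly.reshape(n_days, -1)  # one flattened row per day
    km = KMeans(n_clusters=k, random_state=seed, n_init=10).fit(profiles)
    reps, weights = [], []
    for c in range(k):
        members = np.flatnonzero(km.labels_ == c)
        # The member day nearest the cluster center stands in for the cluster
        dists = np.linalg.norm(profiles[members] - km.cluster_centers_[c], axis=1)
        reps.append(int(members[np.argmin(dists)]))
        weights.append(len(members) / n_days)  # frequency becomes probability weight
    return reps, weights

# Example with synthetic data: 365 days, 3 weather features, 12 representatives
rng = np.random.default_rng(0)
days, wts = representative_days(rng.random((365 * 24, 3)), n_days=365, k=12)
```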

Without sufficient weather examples, the optimizer may provide a design for the grid that will fail if a weather event occurs that had not been considered, which carries direct, downstream impacts for the reliability and resilience of the realized grid. Thus, investigating numerous representative days covering the breadth of weather scenarios is desirable to address the intrinsic weather variability, as simulated by the Earth system models. Hourly time-series data for individual representative days in the target horizon is distributed to different computers within a parallel computer cluster; each computer then returns a power grid investment plan optimized for the weather time series provided to it. Solutions among computers will likely differ at first. “If one computer were to see a particularly cloudy day, it might determine that more hydropower is ideal overall. Likewise, if another computer sees a particularly sunny day, then it would recommend solar power investments because that is the most available resource. We are seeking a single investment plan, so the solutions must match,” says Valencia. Using the Laboratory’s Quartz computing cluster, the team could optimize grid investment plans across hundreds of weather scenarios at once. By strategically penalizing mismatched optimization solutions produced by different computers within the cluster, their algorithm can quickly converge on a single solution.
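
The penalty scheme Valencia describes closely resembles the well-known progressive hedging algorithm. The toy sketch below substitutes simple quadratic costs for real grid subproblems to show how penalizing deviation from the average plan drives the per-scenario solutions to a single consensus investment plan.

```python
# Toy progressive-hedging loop (schematic; quadratic stand-in scenario costs,
# not the team's actual grid subproblems).
import numpy as np

def solve_scenario(target, w, xbar, rho):
    """Per-scenario solve: minimize (x - target)^2 + w*x + (rho/2)*(x - xbar)^2.
    For this quadratic toy cost the minimizer has a closed form."""
    return (2.0 * target - w + rho * xbar) / (2.0 + rho)

def progressive_hedging(targets, rho=1.0, iters=100):
    w = {s: 0.0 for s in targets}        # dual weights, one per scenario
    x = {s: float(s) for s in targets}   # initial per-scenario plans
    for _ in range(iters):
        xbar = np.mean(list(x.values()))  # consensus (average) plan
        # Each scenario subproblem could run on its own compute node
        x = {s: solve_scenario(s, w[s], xbar, rho) for s in targets}
        xbar_new = np.mean(list(x.values()))
        for s in targets:
            w[s] += rho * (x[s] - xbar_new)  # penalize deviation from consensus
    return np.mean(list(x.values()))

# Scenarios that individually "prefer" different investment levels converge
# toward one shared plan.
print(progressive_hedging([1.0, 2.0, 4.0]))
```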

Tracing Data Streams to the Source

The overarching grid optimization model comprises a constellation of computer scripts that ingest Earth system modeling data alongside infrastructure-focused technoeconomic considerations such as the locations of sited infrastructure, new infrastructure costs, generator and storage capacity, and many other factors. Constructing a software pipeline to process and join data from these different sources into the form required to optimize the electrical grid demanded painstaking effort.

The team used the Department of Energy’s (DOE’s) well-established Energy Exascale Earth System Model (E3SM) to generate weather patterns for future scenarios of interest. With submodels capturing land, sea, rivers, ice, and the atmosphere, E3SM was built with DOE supercomputers in mind. However, properly integrating the E3SM data with other data streams in the pipeline calls for extensive conversions, corrections, and restructuring of data. “An enormous amount of work goes into turning existing Earth system data into the forms necessary for our purposes,” says Monteagudo. She explains that many of her efforts relate to assessing potential data sources for quality, relevance, and compatibility. “Does the data source support the variables that we need? Does it have the necessary resolution? Does it contain underlying statistical biases? We have to ask many questions to ensure that a particular data source will help us answer the questions we are interested in,” she says. 

For instance, the team drew upon the System Advisor Model (SAM), a technoeconomic analysis tool developed by DOE's National Renewable Energy Laboratory (NREL), to estimate the output of wind and solar power facilities for a given weather scenario. "We found that the wind and solar output data produced by E3SM are not always a one-to-one match for what NREL's SAM would like to work with," says Monteagudo. For example, E3SM produced solar irradiance measurements from the perspective of planes parallel to Earth's surface. SAM, instead, utilizes measurements taken from planes perpendicular to incoming solar radiation. Other data preprocessing checks are similarly meticulous, such as choosing the appropriate brightness measurements for the Sun and deciding if hourly time-series data is collected at the beginning, middle, or end of the hour. "These are very in-the-weeds sorts of tasks, but if we didn't perform them—and if we didn't have the in-house expertise to know they were necessary—then we would obtain inaccurate power estimates without knowing the underlying reasons," says Monteagudo.
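
As a greatly simplified illustration of this irradiance reprojection issue, the sketch below projects the direct-beam component onto a tilted panel plane; full conversions of the kind SAM performs also account for diffuse and ground-reflected light, and all inputs here are hypothetical.

```python
# Direct-beam irradiance on a tilted panel (simplified geometric sketch).
import numpy as np

def beam_on_tilted_plane(dni, sun_zenith_deg, sun_azimuth_deg,
                         panel_tilt_deg, panel_azimuth_deg):
    """dni: direct normal irradiance (W/m^2); returns beam irradiance on the panel."""
    z = np.radians(sun_zenith_deg)
    t = np.radians(panel_tilt_deg)
    da = np.radians(sun_azimuth_deg - panel_azimuth_deg)
    # Cosine of the angle of incidence between the sun's rays and the panel normal
    cos_aoi = np.cos(z) * np.cos(t) + np.sin(z) * np.sin(t) * np.cos(da)
    return dni * np.maximum(cos_aoi, 0.0)  # zero when the sun is behind the panel

# Example: sun 30 degrees from vertical on a south-facing panel tilted 20 degrees
print(beam_on_tilted_plane(dni=800.0, sun_zenith_deg=30.0, sun_azimuth_deg=180.0,
                           panel_tilt_deg=20.0, panel_azimuth_deg=180.0))
```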

Although data preprocessing ensures the data pipeline draws on compatible data sources, the team still had to contend with intrinsic statistical factors—referred to as biases—that affect the predictions of these models. Such biases can arise from natural fluctuations such as El Niño and the Pacific Decadal Oscillation, which take place over longer periods than weather models normally simulate. Given the complexity of these phenomena, there is no one-size-fits-all approach to adjusting these factors. "Biases are hardly ever so simple as values being one degree too warm or cold across the board. They depend on one's location, weather patterns, and many other considerations," says Cameron-Smith.
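
For context, one standard climate bias-correction technique is quantile mapping, sketched below. The article stresses that no single adjustment fits all cases, so this is purely illustrative rather than the team's method.

```python
# Quantile mapping: correct future model output by mapping it through the
# historical model-to-observation quantile relationship (illustrative sketch).
import numpy as np

def quantile_map(model_hist, obs_hist, model_future, n_quantiles=101):
    q = np.linspace(0.0, 1.0, n_quantiles)
    mq = np.quantile(model_hist, q)   # model climatology quantiles
    oq = np.quantile(obs_hist, q)     # observed climatology quantiles
    # Locate each future value's quantile in the model climatology, then
    # return the observed value at that same quantile.
    ranks = np.interp(model_future, mq, q)
    return np.interp(ranks, q, oq)

# Example: a model running ~2 degrees too warm gets pulled back toward observations
rng = np.random.default_rng(1)
obs = rng.normal(15.0, 5.0, 10_000)                 # "observed" temperatures
model = obs + 2.0 + rng.normal(0, 1.0, obs.size)    # biased model counterpart
print(quantile_map(model, obs, np.array([20.0, 30.0])))
```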

Researchers must take these factors into account because they have direct ramifications for infrastructure decisions. For example, many power plants—regardless of whether they use coal, biomass, natural gas, or nuclear fuel for energy—rely on river water for cooling, which is returned at a higher temperature to the river source. However, high water temperature can cause die-off of aquatic life that people depend on for their livelihoods, so there are limits on the temperature of water that can be released by such a power plant back into the river. Hence, if the river’s water temperature rises at the power plant’s inlet, less heat can be absorbed by the water before it is discharged back to the river, and energy production must be reduced. If the water exceeds a threshold temperature, the plant may need to be turned off entirely. Cameron-Smith explains the impact of temperature biases when simulating such systems: “If our predicted values for river temperature are consistently too high, then we would wrongly reason that the power plant could never run, so we should invest elsewhere. On the other hand, if the readings are consistently too low, we would incorrectly predict that the plant could function without issue.”
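
A hedged numeric illustration of this cooling constraint, using hypothetical plant parameters: the amount of waste heat the river can absorb shrinks as the inlet temperature approaches the discharge limit, reaching zero at the limit itself.

```python
# Illustrative cooling-water budget for a thermal power plant (hypothetical
# numbers; not data from the study).
CP_WATER = 4184.0  # specific heat of water, J/(kg*K)

def max_heat_rejection_mw(flow_kg_s, t_inlet_c, t_discharge_limit_c):
    """Heat (MW) the cooling water can absorb before hitting the discharge limit."""
    headroom_k = max(t_discharge_limit_c - t_inlet_c, 0.0)
    return flow_kg_s * CP_WATER * headroom_k / 1e6

# A plant drawing 20,000 kg/s of river water with a 30 C discharge limit:
for t_inlet in (22.0, 26.0, 30.0):
    print(f"inlet {t_inlet} C -> {max_heat_rejection_mw(20_000, t_inlet, 30.0):.0f} MW")
# Warmer inlet water cuts the cooling budget (669 -> 335 -> 0 MW), forcing
# reduced generation and, at the limit, a full shutdown, as described above.
```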

Seeking High Resolution

High-resolution data directly impacts the optimization model’s determinations for infrastructure investments. Cameron-Smith explains, “Wind turbines are, of course, placed where it’s windy. These conditions often occur near mountain passes because the topography acts as a funnel for air currents. At 100-kilometer resolution, standard atmospheric models can hardly resolve mountain ranges, let alone individual mountain passes. Even at 25 kilometers, which many modelers would consider high resolution, we found that the implied power generation was still not accurate enough.”

A grid pattern covering Earth with a dark area over California representing the higher-resolution domain.
The Energy Exascale Earth System Model (E3SM) was used to develop a regionally refined model (RRM) and to simulate atmospheric phenomena at 3-kilometer (km) resolution over California, far exceeding E3SM’s standard 100-km resolution. Higher spatial resolution helps to provide better predictions of atmospheric phenomena, such as the winter precipitation pictured below.

Fortunately, an extremely high-resolution version of E3SM's atmospheric model known as Simple Cloud-Resolving E3SM Atmosphere Model (SCREAM) has recently been developed to operate with a grid spacing of just 3 km. (See S&TR, June 2024, Understanding the Changing Climate.) However, simulating the entire globe at this resolution is so computationally slow and expensive that not enough weather events could be simulated for the electrical grid optimization, and outputting the weather variables every hour would produce too much data to handle. Therefore, the SI team developed and tested a regionally refined model (RRM) version of SCREAM that used a stretched grid so that only the atmosphere over the region of interest—in this case, California—needed to be simulated with 3-km grid spacing, while the standard 100-km spacing applied elsewhere. RRM balanced the tradeoff between simulation resolution and the associated compute time and storage requirements. Cameron-Smith explains that the team used California as a test case for developing their methodology because, apart from being home to Lawrence Livermore, the state's power grid is an exciting challenge to model. "California is large and geographically diverse, and it experiences significant variations in meteorological conditions. We experience a range of weather-related events, including atmospheric rivers, wildfires, and fog, that impact energy production and consumption." Using this RRM configuration, the team produced four individual 5-year simulations of weather through the end of the 21st century with the hourly output needed for the models that relate weather conditions to electrical generation and consumer demand.

At standard resolution, a narrow range of predicted precipitation appears across large sections of the state (top row). At higher resolution, a wider range of possible precipitation levels can be pinpointed to specific regions (bottom row).
Higher quality weather predictions obtained with RRM are critical to power infrastructure decision-making. (top row) E3SM predictions show average winter precipitation for a typical year and an extremely wet year at E3SM’s standard 100-km resolution. (bottom row) The higher spatial resolution provided by RRM shows significantly more precision to better inform the efficacy of hydropower facilities, which rely on rainfall and snowmelt.

In addition to the impact of daily weather on the electrical grid, extreme events can also provide shocks to the system that test the resilience of the grid. Two such low-frequency, high-impact threats are wildfires and floods. Few threats are as palpable among California residents as wildfires, which in recent years have led to unanticipated power loss and rising home insurance costs—let alone imminent threats to human life. Unusually high temperatures, dry vegetation, and strong winds create the greatest opportunities for fire to ignite and to spread, and this precise combination of factors led to the outbreak of devastating wildfires in Los Angeles in January 2025. At the other end of the scale, floods can damage power lines, substations, and transmission equipment, leading to outages and safety risks. For both wildfires and floods, a longer-term view of weather history is required, since the conditions for fires and floods usually take weeks, months, or years to develop. To better inform both fire and flood risks, the team developed machine-learning systems that leverage historical observations to forecast dead fuel moisture and river flow across California. For wildfire prediction, dead fuel moisture is critical because it determines the flammability of dead vegetative fuels, such as fallen sticks and branches, which in turn drives fire behavior. In parallel, river flow forecasts support early warnings for flooding by capturing hydrologic responses to precipitation and snowmelt. In the future, these forecasts will allow such extreme events to be incorporated stochastically into the grid optimization system, ensuring that the recommended grid is also resilient to catastrophic events.
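
As a hedged sketch of this kind of data-driven forecaster, the example below trains gradient-boosted trees on synthetic stand-in features; the team's actual model architectures and inputs are not detailed in this article.

```python
# Illustrative machine-learned fuel-moisture forecaster (synthetic data;
# model choice and features are assumptions, not the team's design).
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Hypothetical features aggregated over trailing windows, since fire and flood
# conditions develop over weeks to months: mean temperature, relative humidity,
# wind speed, and cumulative precipitation.
X = rng.random((500, 4))
# Synthetic stand-in target: dead fuel moisture falls with heat and wind,
# rises with humidity and rain (an illustrative relationship only).
y = 30.0 - 12.0 * X[:, 0] + 10.0 * X[:, 1] - 5.0 * X[:, 2] + 8.0 * X[:, 3]
y += rng.normal(scale=1.0, size=len(y))

model = GradientBoostingRegressor(n_estimators=200, max_depth=3).fit(X, y)
print(model.predict(X[:3]))  # forecast dead fuel moisture (%) for three samples
```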

Although the LDRD project has concluded, the effort yielded new computational capabilities that enable additional research efforts—for instance, finding grid solutions that incorporate distributed energy resources (DERs). DERs—a class of small-scale, consumer-operated generation and storage devices that include rooftop solar panels and associated battery storage—are being installed at increasing rates. Further areas of interest include studying new regions, tackling challenges related to new energy sources, such as small modular reactors, and exploring rapid technological developments in energy storage, including new chemical battery technologies, gravity batteries that lift heavy weights, and flywheels.

Two men pointing to a large screen.
Philip Cameron-Smith (left) and Jean-Paul Watson led the Strategic Initiative to develop a computational model relating weather events with choices for optimizing power grid infrastructure investments. Shown here, Cameron-Smith and Watson refer to a model of different power-generation and energy storage options optimized over a 24-hour period.

Cameron-Smith says a fundamental goal of LDRD SI research projects is to facilitate exchanges and collaborations across directorates, which is evidenced by this team’s combination of subject-matter experts from Livermore’s Physical and Life Sciences, Global Security, Computing, and Engineering principal directorates as well as university partners. “Of course, we have excellent codes and powerful computers, but what I argue proved even more critical is the domain expertise provided by our research team to bridge the gap between environmental science and infrastructure,” he says. “The nature of these projects is to get scientists reaching across directorates and talking to each other. I believe Lawrence Livermore does this particularly well. Even if the collaboration requires extra effort, that effort lets us get to the science and develop impactful capabilities.”

—Elliot Jaffe

For further information contact Philip Cameron-Smith (925) 423-6634 (cameronsmith1[at]llnl[dot]gov) or Jean-Paul Watson (925) 424-3923 (watson61[at]llnl[dot]gov).