Beetle Models Advanced Biofuel Production

Horned passalid beetles (Odontotaenius disjunctus) transform low-nutrient wood into food for their colonies as well as beneficial organic matter for forest soils. Lawrence Livermore bioscientist Jennifer Pett-Ridge and other members of a team led by Lawrence Berkeley National Laboratory’s Javier Ceja-Navarro have examined the beetles’ ability to achieve this chemically challenging feat as a model for renewable fuel production. The research was published in the March 11, 2019, edition of Nature Microbiology.

The beetles break down lignin and cellulose in fallen logs and convert the material into simple sugars and organic acids through a partnership with microorganisms housed in their compartmentalized digestive tracts. Each section of the gut supports a different microenvironment—some acidic, some anaerobic—that sequentially degrades the wood and extracts energy. “We study these beetles because they are a natural biorefinery,” says Pett-Ridge. “Understanding how evolution has solved the complex lignocellulose-to-fuel conversion process can help us design better industrial mimics and find novel enzymes or pathways.”
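For intuition, the compartmentalized gut can be pictured as a chain of reactors, each converting a fraction of whatever material survives the previous stage. The short Python sketch below is purely illustrative: the stage names loosely follow the gut regions described for passalid beetles, and the conversion fractions are hypothetical values chosen for demonstration, not data from the study.

# Toy model of staged lignocellulose conversion, loosely inspired by the
# beetle's compartmentalized gut. Stage names and efficiencies are
# hypothetical placeholders, not measurements from the study.
STAGES = [
    ("foregut (mechanical grinding)", 0.20),
    ("midgut (aerobic depolymerization)", 0.40),
    ("anterior hindgut (anaerobic fermentation)", 0.50),
    ("posterior hindgut (residual digestion)", 0.30),
]

def staged_conversion(wood_grams):
    """Pass lignocellulose through sequential compartments, each converting
    a fraction of what remains into sugars and organic acids."""
    converted_total = 0.0
    remaining = wood_grams
    for stage, fraction in STAGES:
        converted = remaining * fraction
        converted_total += converted
        remaining -= converted
        print(f"{stage}: {converted:.1f} g converted, {remaining:.1f} g remains")
    return converted_total

total = staged_conversion(100.0)
print(f"Total converted: {total:.1f} g of 100.0 g input")

The point of the toy model is that no single stage is especially efficient, yet the series extracts most of the input (here, about 83 of 100 grams), which is the sense in which the gut functions as a sequential biorefinery.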

Non-food, lignocellulosic feedstocks proposed for biofuel production share a similar composition with the beetles’ woody food source, making the insects’ digestive process a model for efficient conversion of plants into hydrogen, methane, ethanol, and other bioenergy products. The team characterized the gut microorganisms and identified anatomical features of the gut that contribute to efficient metabolism of lignocellulosic material. Says Pett-Ridge, “Gaining insight into how the gut microbiome populations interact to deconstruct lignocellulosic materials to sugars or potential biofuels could potentially aid in the optimization of industrial cellulosic degradation.”
Contact: Jennifer Pett-Ridge (925) 424-2882 (pettridge2@llnl.gov).

Researchers Solve a 50-Year-Old Puzzle

An international team, including Lawrence Livermore scientists, has closed a long-standing gap in physicists’ understanding of beta decay—a process in which protons inside atomic nuclei convert into neutrons, or vice versa, to form the nuclei of other elements. Their work, published in the March 11, 2019, edition of Nature Physics, helps explain why experimental beta decay rates in atomic nuclei are slower than calculated rates.

Historically, nuclear physicists have described the beta decay rate by artificially scaling the interaction of single nucleons with the electroweak force, a process referred to as “quenching.” However, predictive methods require accurate calculations of the structure of both the mother and daughter nuclei and how nucleons (individually and as correlated pairs) couple to the electroweak force that drives beta decay. The team simulated decay rates from light to heavy nuclei using high-performance computing resources at Livermore and Oak Ridge National Laboratory, and demonstrated that their approach works consistently across nuclei where ab initio calculations are possible. “By combining modern theoretical tools with advanced computation, we have shown that for a considerable number of nuclei, we can reconcile the discrepancy between experimental measurements and theoretical calculations,” says Livermore nuclear physicist and co-author Kyle Wendt. The researchers also demonstrated that the apparent quenching arises from physics within the nucleus itself: correlations among nucleons and the coupling of the electroweak force to pairs of nucleons, rather than a fundamental modification of the interaction.
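To make the quenching prescription concrete: in conventional (textbook) notation, the Gamow-Teller contribution to the beta decay rate scales with the square of the axial-vector coupling g_A, and calculations were historically reconciled with experiment by substituting an empirically reduced effective coupling. The expression below illustrates that standard convention; it is not reproduced from the paper.

\[
  B(\mathrm{GT}) \;\propto\; \bigl(g_A^{\mathrm{eff}}\bigr)^{2}\,
  \bigl|\langle f \,\|\, \boldsymbol{\sigma}\tau \,\|\, i \rangle\bigr|^{2},
  \qquad
  g_A^{\mathrm{eff}} = q\, g_A,\quad q \approx 0.75,\;\; g_A \approx 1.27 .
\]

The team’s result implies that no such ad hoc factor q is needed once the underlying physics it stood in for, nucleon correlations and two-nucleon electroweak currents, is treated explicitly.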

The research effort sets the path toward accurate predictions of beta decay rates for unstable nuclei in violent astrophysical environments, such as supernova explosions or neutron star mergers. Livermore nuclear physicist Sofia Quaglioni, co-author of the paper, says, “The methodology in this work may hold the key to accurate predictions of the elusive neutrinoless double beta decay, a process that, if seen, would revolutionize our understanding of particle physics.”
Contact: Sofia Quaglioni (925) 422-8152 (quaglioni1@llnl.gov).

Reducing Climate Model Uncertainty

Earth System Models (ESMs), which are run on advanced computers to simulate aspects of Earth’s variability, have improved and grown in complexity since their first use in the 1970s, but inconsistencies remain. Lawrence Livermore scientist Stephen Klein and collaborators investigated the “emergent constraint” (EC) approach for evaluating ESMs and found that the method may improve the models by focusing attention on the variables most relevant to climate projections. The research appeared in the March 18, 2019, edition of Nature Climate Change.

The EC approach combines climate simulations with contemporary measurements, seeking observable quantities in the current climate that can narrow uncertainties in projections of climate change parameters, such as temperature. Trusting such a constraint requires a mechanistic understanding of why a variable in the current climate is related to how that variable will change as the climate evolves. “The EC approach offers a promising way to reduce key uncertainties in future climate,” says Klein, co-author of the paper. “It could also pave the way for further discoveries about climate system behavior and reduce the uncertainty in critical aspects of climate change.”
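Schematically, an emergent constraint is a regression across a model ensemble: each model supplies a simulated present-day observable and a projected future change, and a real-world measurement of the observable then narrows the spread of projections. The Python sketch below illustrates that logic with synthetic numbers standing in for a model ensemble; the linear relationship, its parameters, and the “observed” value are all hypothetical.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ensemble: each model's simulated present-day observable x
# and its projected future change y, assumed roughly linear across models.
n_models = 20
x = rng.normal(1.0, 0.3, n_models)             # present-day quantity (arbitrary units)
y = 2.0 * x + rng.normal(0.0, 0.2, n_models)   # projected change, with scatter

# The "emergent" relationship: a regression across the ensemble.
slope, intercept = np.polyfit(x, y, 1)

# A real-world measurement of x selects the portion of the relationship
# consistent with observations, narrowing the projection of y.
x_observed = 1.1
print(f"unconstrained projection: {y.mean():.2f} +/- {y.std():.2f}")
print(f"EC-constrained estimate:  {slope * x_observed + intercept:.2f}")

The hard part, as the article notes, is establishing that such a statistical relationship reflects a real physical mechanism rather than a coincidence of the ensemble.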

With support from the Department of Energy’s Office of Science, the research team created a framework for assessing EC methods, along with indicators of whether a proposed constraint rests on a trustworthy statistical relationship. According to Klein, future applications of an EC approach include modeling to identify climate system tipping points. More consistent climate change projections could help society better plan for future environmental and economic impacts.
Contact: Stephen Klein (925) 423-9777 (klein21@llnl.gov).