Additive manufacturing (AM) offers researchers unprecedented flexibility to print structures with shapes, properties, and behaviors that do not readily exist in nature and with microscopic features that were previously impossible to fabricate. AM can create “metamaterial” structures by arranging multiple materials at small scales to tailor their response to different physical phenomena, giving them the ability to change shape, phase, and properties.
Simultaneously, AM poses a design challenge, as leveraging its many available material layouts can mean considering billions of design possibilities. At best, so many options lead to a lengthy and costly trial-and-error design process only one step above guesswork; at worst, they can be paralyzing. “Having the ability to print at a very fine scale and to print different materials at different locations creates a huge design problem because we can’t simply guess and achieve what we want with trial and error anymore,” says Dan Tortorelli, the director of Livermore’s Center for Design Optimization (CDO).
Computational design optimization can resolve these extremely large and complex design problems that cannot be solved by intuition, prior experience, or trial and error. Design optimization is a formal mathematical process of finding the best possible design, given an objective (such as maximizing strength) and a set of constraints (such as cost or weight). The computer generates optimized designs automatically by iteratively modifying the design until it achieves the best outcome. “Using the traditional design method, we took existing designs and kept modifying them to meet new specifications, but we never knew if we were leaving something on the table. Maybe we could have saved more mass or made it stronger,” says Tortorelli. The computer also finds solutions that may not be intuitive to humans, which is critical for working with metamaterials, where few, if any, good existing designs serve as points of departure. “If the goal is to make a strong bridge, many bridges exist to reference for generating ideas. However, when multiple physics equations governing a system and many variables that affect the design exist simultaneously, using intuition to effectively design something becomes impossible,” says optimization engineer Kenny Swartz.
Since its inception in 2017, CDO has been leading a design revolution, developing software tools and optimization techniques to complement advances in AM and solve the Laboratory’s largest and most complicated design problems. (See S&TR March 2018, Leading a Revolution in Design.) The center’s work revolves around two Livermore-developed high-performance computing software codes—Livermore Design Optimization (LiDO) and Smith—that automate the most challenging parts of the design process. “If we had an optimization problem that was complicated to set up, we couldn’t test our first idea for weeks because of how long the process took. If we wanted to try something else, the timeline extended another week or two. This iteration time can really bog down a project,” says Swartz. “Now, once we understand what someone’s needs are, we can get that first iteration done in just a day or two.”
Problems with Problems
Smith and LiDO are separate codes with separate goals, but they have a symbiotic relationship. Using LiDO, researchers can quickly pose and modify optimization problems and automatically generate nonintuitive designs. Smith, on the other hand, enables researchers to easily simulate different designs using finite element analysis to evaluate how well each meets the optimization objective and constraint functions. The code computes their derivatives with respect to the design parameters to determine which changes would improve the design. Together, the two codes help the Laboratory innovate faster and respond quickly to national security challenges. “Design has always been an iterative process, and we’re trying to accelerate it,” says engineer Brandon Talamini, who leads the Smith team. “Designers are still in control and the idea of what makes a good design continues to depend on an expert, but Smith and LiDO give that person the tools to make these iterations faster.”
Optimization goals can vary widely, but the most common include topology optimization—improving the distribution of different materials in a given spatial region—and shape optimization, which focuses on refining the shape of the region. Other formulations are possible as well, for example, enhancing the manufacturability of a design. These optimization techniques can be applied to many problems, from light-weighting parts (minimizing weight without compromising performance) to maximizing energy density in batteries to designing microarchitectures to obtain desired, even novel, material behavior. “We can help with objects that vary smoothly naturally or that can be allowed to vary smoothly, such as the shape of a part,” says Seth Watts, a computational engineer who leads the LiDO team. “That smoothness enables us to define derivatives, and those derivatives then enable us to scale to millions or billions of design parameters.”
At their core, optimization problems are all mathematics. Posing an optimization question means defining the objective and constraints as functions of design variables (parameters) that describe the material layout and the shape of the region that can be changed. Everything in the optimization problem needs to be expressed as functions, essentially translating the intuitive idea of a good design into equations. “Usually, understanding what people are trying to do with a design and developing metrics that capture their intent takes a few conversations, because at the end of the day, we have to translate that real-life need into a math problem,” says Swartz.
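The translation from real-life need to math problem can be shown in miniature. The sketch below is a toy illustration only—the rod dimensions, load values, and penalty method are invented for this example and are not drawn from LiDO or Smith. The objective is the material volume of a rod, the constraint caps its deflection under a load, and a simple penalty-plus-gradient-descent loop finds roughly the lightest design that still satisfies the constraint.

```python
# Toy illustration (not Laboratory code): posing a design question as math.
# Design variable x = rod cross-sectional area; objective = material volume;
# constraint = tip deflection under a load must stay below a limit.
# All numbers are made up for demonstration.

L, F, E = 1.0, 100.0, 1e5          # rod length, applied force, modulus
max_deflection = 0.01

def objective(x):                   # volume to minimize
    return L * x

def constraint(x):                  # requires deflection(x) - limit <= 0
    return F * L / (E * x) - max_deflection

# Penalty method: fold the constraint into one function and descend its slope.
def penalized(x, mu=1e4):
    g = constraint(x)
    return objective(x) + mu * max(g, 0.0) ** 2

x = 0.5                             # initial guess for the area
for _ in range(2000):
    h = 1e-6                        # finite-difference slope of the penalized objective
    grad = (penalized(x + h) - penalized(x - h)) / (2 * h)
    x -= 1e-3 * grad

# The loop settles near the smallest area that still meets the deflection
# limit; the analytic optimum is x* = F*L/(E*max_deflection) = 0.1.
```

A real problem replaces the single variable `x` with millions or billions of parameters and the one-line deflection formula with a full physics simulation, which is why the derivatives discussed below become so important.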
Solving an optimization problem can involve significant efforts to write code, debug errors, and resolve complicated equations. Optimization programs can also return completely different designs depending on whether they are asked to optimize for weight, cost, or other factors. “Optimization solvers are similar to genies. They provide exactly what was requested but not necessarily what someone intended to request,” says Watts. Even then, a solution might still not be successful in meeting a need, requiring further design iteration. “I don’t think anyone has ever written an optimization problem exactly right the first time, so inevitably, we solve many different optimization problems to arrive at the problem that gives us the solution we want,” adds Watts.
A Tale of Two Codes
Development of the LiDO (Livermore Design Optimization) code started in 2017 as the Center for Design Optimization’s (CDO’s) first Laboratory Directed Research and Development (LDRD) project to help researchers solve optimization problems for complex additive manufactured (AM)-enabled designs. Work using Smith began in 2020 to provide a flexible framework to simulate design performance for all types of engineering systems. “We offer a framework for analysis that’s compatible with design optimization and easily adaptable in a full nonlinear environment,” says Jamie Bramwell, who led Smith’s development from 2020 to 2023. “The fact that we can answer design questions and develop state-of-the-art models at the same time is revolutionary.”
Both LiDO and Smith leverage software libraries from Livermore’s RADIUSS (Rapid Application Development via an Institutional Universal Software Stack) ecosystem, which shortens development time and takes advantage of the Laboratory’s high-performance computing environment to efficiently solve large, complicated design problems, even as the physics become increasingly complex. The codes grew together as the teams recognized how much they could help each other. Optimization engineer Kenny Swartz says, “LiDO is Smith’s biggest customer, but at the same time, Smith enables pretty much everything that we do in LiDO, leading to a mutually beneficial relationship.”
The Differentiating Factor
To solve design problems with millions or billions of design variables, researchers need to calculate gradients of the optimization cost and constraint functions. Gradients quantify how sensitive cost and constraint functions are to design parameters and provide the means for optimization programs to effectively modify the design. Similar to how the slope of a hill would direct the movement of a ball rolling down to its lowest possible location, gradients direct a design toward its optimal outcome in the quickest manner. Gradients are computed by differentiating the entire chain of finite element simulations—known as forward simulations—that predict how a given design will perform. Each step in the calculation contributes to the final computed values of the objective and constraint functions, which means the sensitivity of each step needs to be considered when evaluating the gradients. Since partial differential equations are solved in the forward simulations, computing the gradients is a difficult process. “The function and input are extremely complex. While writing the function is standard for anybody in our field, evaluating derivatives is a very difficult task,” says Tortorelli.
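The chain rule at the heart of this process can be sketched with a hypothetical one-variable example (the numbers are invented, and a real forward simulation is vastly larger): the “simulation” solves a single spring equation for a displacement, the objective measures compliance, and the gradient with respect to the design variable is the product of the derivatives of each step, checked here against a finite-difference estimate.

```python
# Toy sensitivity analysis (illustrative only, not Laboratory code):
# differentiating through a one-equation "forward simulation."

F = 10.0                       # applied load (made-up value)

def forward(k):                # forward simulation: solve k*u = F for displacement u
    return F / k

def compliance(u):             # objective: work done by the load (lower = stiffer)
    return F * u

def gradient(k):               # chain rule: dJ/dk = (dJ/du) * (du/dk)
    dJ_du = F                  # derivative of the objective w.r.t. the state
    du_dk = -F / k**2          # derivative of the solved state w.r.t. the design
    return dJ_du * du_dk

k = 4.0                        # design variable: spring stiffness
analytic = gradient(k)

h = 1e-6                       # finite-difference check of the analytic gradient
numeric = (compliance(forward(k + h)) - compliance(forward(k - h))) / (2 * h)
# analytic = -F**2 / k**2 = -6.25, and numeric agrees closely
```

In practice, `forward` is a partial differential equation solve rather than a one-line formula, and each intermediate step contributes its own derivative to the chain, which is what makes hand-deriving these gradients so error-prone.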
Although Livermore has many codes that excel at specific types of forward simulations, simulating new systems such as those involving metamaterials can be complicated and time-consuming. “Many of our production codes are very good at modeling the sorts of materials we use for the Laboratory’s missions, but with AM, we’re literally creating new materials that have no models,” says Watts. “We have to create the models ourselves, and they’re invariably weird, nonlinear, and complex.” The Smith code provides a forward simulation platform that is easy to use even with complicated designs and also computes gradients automatically. “We can tell the code to take a derivative of itself,” says Smith code developer Jamie Bramwell. “While that sounds simple, not many codes have such a capability, especially for the kinds of complex, nonlinear, or large-scale physics that Livermore simulates.”
Smith’s automatic differentiation is a novel feature for optimization that eradicates one of the longest and most difficult steps in the process. “Speaking from painful experience, ensuring that those derivatives are correctly derived and implemented with paper and pencil is the stage at which we would spend all of our time debugging an optimization code, even for simple models,” says Watts. “The fact that the derivatives are exact and can be generated automatically using Smith without having to write code significantly accelerates the process.”
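The idea behind a code “taking a derivative of itself” can be illustrated with forward-mode automatic differentiation, shown here as a bare-bones dual-number class. This is a sketch of the general technique only; Smith’s machinery is far more sophisticated and operates on large-scale nonlinear finite element physics rather than scalar arithmetic.

```python
# Minimal forward-mode automatic differentiation via "dual numbers"
# (a generic textbook technique, not Smith's actual implementation).

class Dual:
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot      # value and derivative travel together

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,   # product rule, applied automatically
                    self.val * other.dot + self.dot * other.val)
    __rmul__ = __mul__

def f(x):                                   # any code written with + and *
    return 3 * x * x + 2 * x + 1

x = Dual(2.0, 1.0)                          # seed the derivative dx/dx = 1
y = f(x)
# y.val holds f(2) = 17 and y.dot holds f'(2) = 14,
# with no hand-written derivative anywhere in sight.
```

Because every arithmetic operation carries its derivative along with its value, the derivative is exact to machine precision rather than approximated, which is the property Watts credits with eliminating the paper-and-pencil debugging stage.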
Smith’s automatic differentiation capabilities are especially useful for metamaterial design, giving researchers more time to focus on what makes a design good instead of becoming lost in calculations. “Smith enables us to elevate our reasoning up a level and keep our energy focused on posing the optimization question,” says Bramwell.
Modularity Contest
LiDO and Smith’s hallmarks are their modularity and ability to support many kinds of physics while tackling large problems. Engineers can use the codes to quickly generate designs, with different goals for the same system—all without having to start from the very beginning each time. “I think of them as building blocks,” says Tortorelli. “Each block has some functionality, and we can combine them to solve an optimization problem. If we need a new block with new functionality, the code provides a simple way to make one.”
Many optimization problems at the Laboratory start with a finite element model of a system, which breaks down a complicated 3D region into a gridlike map of discrete (finite) elements that a computer can process and use to solve the system’s governing equations—similar to the process for rendering computer graphics. The model equations consider the shape of the region as well as its material properties, the forces that act on it, and the underlying physics that govern its behavior, for example, the conservation of mass, momentum, and energy to predict the distortion of a space structure when heated. LiDO and Smith are both built on top of the Laboratory’s Modular Finite Element Methods (MFEM) library. MFEM provides scalable and effective implementations of the mathematical abstractions used in finite element theory that are applicable to many kinds of physics.
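At its smallest, the finite element idea looks like the following sketch: a 1D elastic bar, fixed at one end and pulled at the other, broken into a handful of elements whose assembled stiffness equations are solved for the displacement at each node. All values are invented for illustration; real models such as those built on MFEM are three-dimensional, multiphysics, and run in parallel on millions of elements.

```python
# A miniature finite element model (illustrative values, not Laboratory code):
# a 1D elastic bar fixed at one end and pulled at the other.

E, A, L, F = 100.0, 1.0, 1.0, 5.0     # modulus, cross-section, length, tip load
n = 4                                  # number of elements
h = L / n
ke = E * A / h                         # stiffness of each element

# Assemble the global stiffness matrix for the n free nodes
# (node 0 is fixed, so it drops out of the system).
K = [[0.0] * n for _ in range(n)]
for e in range(n):
    if e > 0:                          # element e couples free nodes e-1 and e
        K[e-1][e-1] += ke
        K[e-1][e] -= ke
        K[e][e-1] -= ke
    K[e][e] += ke
f = [0.0] * n
f[-1] = F                              # load applied at the free tip

# Solve K u = f by Gaussian elimination (no pivoting; K is well conditioned).
for i in range(n):
    for j in range(i + 1, n):
        m = K[j][i] / K[i][i]
        for c in range(i, n):
            K[j][c] -= m * K[i][c]
        f[j] -= m * f[i]
u = [0.0] * n
for i in reversed(range(n)):
    s = f[i] - sum(K[i][c] * u[c] for c in range(i + 1, n))
    u[i] = s / K[i][i]

# For a uniform bar, the computed tip displacement matches the textbook
# answer F*L/(E*A) = 0.05.
```

Scaling this pattern from a 4-by-4 matrix to systems with billions of unknowns, while keeping the assembly and solve differentiable, is the role the MFEM foundation plays for both codes.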
Optimization engineer Jorge-Luis Barrera Cruz used this capability to optimize active devices, which can change shape in response to stimuli such as light, heat, and electricity. (See S&TR, April/May 2025, Printing the Future of Fusion Targets.) “Smith enables us to quickly prototype simple setups,” he says. “The implementation is very lean, so we don’t spend too much effort adapting material models that capture complex and highly nonlinear material responses into the code.”
Digital Twins
Converting optimized designs into instructions that 3D printers can use to print them can be challenging. For large lattice structures, the optimized design is described by a surface mesh with millions of data points that take multiple steps to distill into instructions the printer can process. Livermore is developing methods to streamline this workflow so print-ready instructions are included as part of the optimization output.
One aspect of this effort is exploring the effectiveness of digital twins—computational models of 3D printers that run real-time simulations of the printing process in parallel with the physical process itself. The physical printer continuously passes geometric data to the digital twin to track how accurately the printer is building a part and to simulate the printed part’s performance to gauge if it is acceptable. Through the scalability inherited from the MFEM (Modular Finite Element Methods) library, Smith uses highly resolved meshes to model the as-built geometry. This approach can be used to refine print process parameters and make the printing more reliable. “The toolpath changes directions all the time, and when it does, large accelerations that might throw the printing path off can occur,” says computational engineer Seth Watts. “By designing to minimize those accelerations or at least reduce them to acceptable levels, we achieve a more reliable print.”
In one collaborative project with AM experts, the team is using this functionality to design mechanical digital logic gates, devices that perform basic computations mechanically instead of digitally for operating in extreme or zero-power environments. “These devices are complex machines, and finding designs that do what we want through sheer intuition alone is challenging,” says Talamini. “Furthermore, we’re able to find these designs without having to spin up a new code every time. LiDO and Smith enable us to stand up something quickly using the modularity and automatic differentiation that these two codes offer.”
Smith, in particular, was designed for agility to help the Laboratory respond to national security challenges faster. The code is built such that it does not matter what derivatives it is computing—in theory, researchers can provide any design variables for their bespoke problems. “The inputs are all partial differential equations that use finite element methods, but Smith’s underlying infrastructure is completely agnostic to what equations those are,” says Watts.
Smith also leverages and centralizes the powerful capabilities of RADIUSS (Rapid Application Development via an Institutional Universal Software Stack), functioning as something of a software library. “Much of Smith’s development was a software engineering challenge,” says Bramwell. “With the wealth of RADIUSS tools available to us, we asked ourselves: If we can put them together in a well-defined software ecosystem, could we then build models faster for engineering codes?”
Natural Partnership
CDO works alongside Lawrence Livermore’s Center for Engineered Materials and Manufacturing (CEMM), the Laboratory’s hub for AM innovation, and the two organizations maintain a close working relationship. “Design tools that generate complex, nonintuitive structures pair extremely well with AM because only these innovative manufacturing processes can produce such complicated structures,” says Swartz. CDO’s connections with CEMM experts have helped expand its influence and network.
One ongoing project is quantifying printability limitations and enforcing them as constraints in optimization. Certain materials and layouts exist that cannot be printed even with the most advanced AM technologies, and lack of printability is often a key reason an optimized design fails. “We want to give stakeholders a design that simply prints with no complications,” says Swartz. “The outcome is better for everyone if we can directly encode printability in the optimization rather than trying to make optimized designs printable after the fact.”
Structural design is the classic optimization application, but the techniques can also be applied to AM to improve the 3D-printing technology itself. Watts has helped determine optimal process parameters for several printing methods: projection microstereolithography, electrophoretic deposition AM, and tomographic volumetric AM. Barrera Cruz, meanwhile, has optimized printing paths—the route the print nozzle follows while fabricating a structure—to maximize the performance of stimuli-responsive liquid crystal elastomer structures.
New Iterations for New Applications
At any given time, CDO is involved in a variety of Laboratory Directed Research and Development (LDRD) projects and programmatic application-focused work. Common designs they contribute to include mass mocks (structures built from different materials that replicate a component’s mass properties for flight tests) and mechanical mounts for systems operating in extreme environments. One project Swartz worked on involved designing a mount for a telescope lens whose focal length changes undesirably with temperature. The optimized mount passively compensates for these temperature variations, ensuring the system remains in focus without requiring active control systems.
Originating in Livermore’s Engineering Principal Directorate, LiDO and Smith are engineering-mechanics-centric codes first, but researchers are working to expand capabilities and application areas. CDO’s affiliates have used their tools to design and print electrochemical systems for batteries, capacitors, and thermal-fluidic systems such as heat exchangers, and to develop actuation plans to control robots and other AM devices.
Collaborations with external groups and within Lawrence Livermore often drive new capabilities. The team attends conferences, workshops, and Laboratory events to learn about the problems people face and to anticipate which new capabilities to add. Swartz enjoys collaborations because of the opportunity to exchange knowledge with experts in other fields. “The work is fun because I’m teaching others about optimization and what our team can do, and they’re teaching me what their needs are, which is driving some of LiDO’s and Smith’s development,” he says. “We know we need customers to survive, so if a needed capability aligns with our goals and we have the bandwidth, we try to implement it.”
The LiDO team’s biggest focus is publicity to expand its reach and find new applications and new users, particularly in the Laboratory’s mission-related work. One of Watts’s ideas is developing LiDO “applets,” small, user-friendly instances of the code that solve common design optimization problems across the Laboratory, such as designing mass mocks.
For the Smith code, the goal is to develop a full ecosystem of automatically differentiable finite element codes that simulate all the phenomena engineers and physicists require to model and ultimately optimize their systems. Some of the physics capabilities they have already added can be used to model helium-cooled tritium breeders for inertial fusion energy, electrochemistry systems, gas flow devices, and alloy melting and solidification during casting.
The software teams work closely together to make sure LiDO and Smith are compatible with one another and with other software at the Laboratory as they evolve. “Our teams collaborate so we can ensure that as big decisions are made, both codes will remain compatible with each other,” says Swartz. “We both need this relationship to work for us to survive.” As LiDO and Smith continue their evolution, the goal is not to make two all-encompassing programs, but to build tools that enrich Livermore’s software ecosystem. “We’re not going to solve all the Laboratory’s problems in Smith, but I think we’re learning lessons that can feed into a lot of other efforts and developing ideas that can gain uptake into other codes,” says Talamini.
Paradigm Shift
LiDO and Smith have the potential to significantly accelerate the design cycle at Livermore and to deliver innovative new designs for the Laboratory’s research and national security missions. At the same time, Livermore needs to maintain confidence in its designs, which means analyzing their performance using trusted, more mature codes that have been formally verified and validated. “We can treat LiDO as a magic design generator,” says Watts. “Users don’t have to simply trust our design—they can check it with their own code.”
The teams are working diligently to understand the problems that programs face and ensure their tools maintain compatibility with existing, established workflows. Many current designs are defined in computer-aided design geometry instead of finite element meshes used by LiDO and Smith, so optimization engineers have begun working on design exercises and technology demonstrators with colleagues in the programs to understand how best to reinsert their optimized designs into the production workflows. Tortorelli believes this work is worth the investment. “Our codes can significantly help the programs by designing things better and more efficiently,” he says.
Realizing the codes’ potential requires changing how people at the Laboratory approach design problems. Talamini says, “Instead of guessing a design and checking it with a simulation, we want people to take a step back and think about what it is they are trying to achieve with their design.” Tortorelli adds, “I want people at the Laboratory to think formally about the design process and formalize the design concept. By doing so, we can tell how close they are to meeting their needs, quantify aspects of the design using these automated tools, and achieve the end goal in an efficient manner.”
Tortorelli measures the group’s success in the emergence of “optimization” as a key word at Lawrence Livermore. Laboratory Director Kimberly Budil has named design optimization as one of Livermore’s most important areas of research for the future. Using a fundamentally different approach from gradient-based LiDO–Smith design optimization, the CDO team was instrumental in the success of the DarkStar project (see S&TR, October/November 2024, Beginning at the End), which developed novel machine learning–based techniques to optimize hydrodynamic events. Tortorelli estimates that nearly half of all new LDRD projects are related to optimization.
Success is further measured in connections across Livermore and externally through publishing journal articles, attending conferences, and recruiting doctoral students from top universities. “I think we’re getting a big footprint for international recognition of the optimization work that we’re doing here,” says Tortorelli, citing a recent optimization conference in Kobe, Japan, to which CDO sent 10 staff. “We were the largest U.S. contingent there, and people in the optimization community now know Livermore and see all the papers that we publish, where 10 years ago, we were nonexistent.”
That reputation has helped Livermore build the exceptional team of optimization experts it has today. The team’s innovations make LiDO and Smith world-class design tools, and while transforming design at the Laboratory will take work, they feel they are up to the challenge. “We have one of the preeminent design optimization communities anywhere in the world,” says Talamini. “Building these kinds of tools and bringing them to the point of answering real-life questions the way we do takes an extraordinary depth of knowledge across a broad range of disciplines and close collaboration among many people.”
—Noah Pflueger-Peters
For further information contact Dan Tortorelli (925) 423-5313 (tortorelli2 [at] llnl.gov).