Lawrence Livermore National Laboratory



A Hub for Collaborative Innovation
The artist’s rendering in the background above shows the facility planned for the High Performance Computing Innovation Center (HPCIC), which will include offices, conference rooms, and collaboration spaces. (Photograph by George Kitrinos)

Lawrence Livermore has been a leader in innovative science and technology for more than six decades, working to ensure the nation’s security and delivering solutions to other important national needs. Many of the Laboratory’s accomplishments result from strategic partnerships with private industries and academic institutions. Breakthrough technologies developed through mission-related research are often transformed into products and capabilities that lead to new industries and economic growth.

To broaden the reach of these partnerships, the Department of Energy’s (DOE’s) National Nuclear Security Administration authorized Lawrence Livermore and Sandia national laboratories to develop a collaboration space outside the two laboratories’ fenced perimeters. Established in 2011, the Livermore Valley Open Campus (LVOC) is growing into an innovation hub for unclassified research and development activities. The goals for LVOC are to enhance technology transfer from the national laboratories to the private sector, accelerating the pace of industrial innovation, and to expand the laboratories’ knowledge base through greater commercial and academic interactions.

LVOC is modeled after research parks, which operate with less restrictive security and access controls than are required for the often-classified research at Lawrence Livermore and Sandia. The business and operating principles for the open campus are designed to enhance collaborations with a wide range of organizations, from industrial partners and think tanks to academic institutions and federal, state, and local agencies.

According to Betsy Cantwell, Livermore’s former director for economic development (now at Arizona State University), “LVOC bridges the gap between businesses and the national labs.” She adds, “By working in an unclassified, highly accessible setting, our partners can progress more quickly on new products and bring them to market faster, increasing their profitability.” Collaborations also enhance the laboratories’ national security missions, allowing researchers to develop expertise in new areas. “Partnerships strengthen our research capabilities and help us stay at the forefront of science, technology, and engineering,” adds Cantwell. “We gain industries’ knowledge, and they gain economic value.”

The open campus is being built on a 110-acre parcel along the eastern edge of the two neighboring laboratories. (See S&TR, March 2011, New Campus Set to Transform Two National Laboratories.) Each laboratory has established an “anchor tenant”: Sandia’s is the Combustion Research Facility, and Livermore’s is the High Performance Computing Innovation Center (HPCIC). Livermore’s facility provides offices for both LLNL staff and partners, along with conference and classroom facilities. Plans for the next decade include adding conference space, collaboration facilities, and a visitor’s center.

HPCIC has become a hub of collaborative activity and the venue for various workshops, such as the 24-hour “hackathons” sponsored by Livermore’s Computation Directorate to encourage collaborative programming and creative problem solving by employees and students at LLNL. (Photograph by Meg Epperly.)

HPC Leads the Way

High-performance computing (HPC) is integral to every research program at Lawrence Livermore, which is home to some of the world’s most powerful supercomputers. HPCIC was thus an appropriate choice as the Laboratory’s first venture at LVOC. “High-performance computing is part of the Laboratory’s DNA,” says Cantwell. The center opened in 2011 with the goal of cultivating HPC-based collaboration and knowledge exchange in a business-friendly environment. Since then, HPCIC has hosted more than 3,000 events and has received more than 25,000 visitors. “It’s become a hub of collaborative activity and the venue of choice for workshops in a broad range of subjects,” says Cantwell.

The Laboratory is renowned for its HPC resources and expertise. “Having these capabilities is vital to our national security missions,” says computational physicist Frederick Streitz, the HPCIC director. “We continually push to keep our capabilities at the foremost edge of computational innovation. Simulations help researchers gain answers more quickly and with greater accuracy, allowing them to develop a more detailed understanding of the processes and materials they are studying. And that leads to more confidence in the recommendations we pass on to decision makers.”

As a result of that push to expand capabilities, the HPC resources at Livermore and other national laboratories are often more than a decade ahead of many private companies. “It’s not only our supercomputers that offer better performance,” says Streitz, “but also the support systems, software, and expertise we develop in trying to fully exploit these computational resources.” (See the box below.)

HPCIC makes those same benefits available to industrial partners to accelerate their innovation cycles. Collaborative teams include experts from every participating organization, whether a private company, an academic institution, or the Laboratory. Research topics range from simulations for optimizing a “smart,” or interconnected, electrical grid system and for predicting the availability of renewable energy sources to examining laser–plasma interactions and discovering new drugs and treatment options that improve human health.

“There’s a growing consensus worldwide that HPC is a key to accelerating the technological innovation that underpins a nation’s economic vitality,” says Streitz. For example, complex models now allow developers to create virtual prototypes of new devices and work through design iterations. Simulations might reveal a design flaw or show how a workflow can be improved, allowing developers to implement new ideas before building with materials.

A Complete Ecosystem of High-Performance Computing Resources

The High Performance Computing Innovation Center (HPCIC) provides industrial partners access to Lawrence Livermore’s unmatched computing resources. Frederick Streitz, the HPCIC director, says, “A massive supercomputer with one and a half million cores is impressive, but by itself, it’s of limited use. It’s the Laboratory’s computational ecosystem—our hardware, software, and experienced people—that enables many discoveries.”

At Livermore, the high-performance computing (HPC) “ecosystem” includes more than 25 systems for parallel numerical simulations, visualization, and data analytics backed by several massive parallel file systems and storage archives. The Laboratory also has comprehensive software assets for HPC, including numerical libraries, highly scalable scientific application codes, and the supporting system software and tools. “But most importantly,” says Streitz, “we have the necessary team of talent and experience, from system administrators to experts in performance optimization, applied mathematicians, and computer scientists.”

The Catalyst supercomputer is available for exploring data-intensive technologies, architectures, and applications. Developed by a partnership of Cray, Intel, and Lawrence Livermore, this Cray CS300 machine is a resource for the National Nuclear Security Administration’s Advanced Simulation and Computing Program. The Catalyst cluster has 7,776 processors (or compute cores) distributed over 324 nodes, each with nearly a terabyte (10¹² bytes) of addressable memory. The nonvolatile memory (or NVRAM) retains files even when the power is off, as on a USB memory stick or an MP3 player. Catalyst allows researchers to explore new approaches to big data analytics and hierarchical memory systems.

For research that requires advanced architectures, industrial collaborators can access Vulcan, an IBM BlueGene/Q system capable of 5 quadrillion floating-point operations per second (5 petaflops). It came online in 2013 and consists of 24 racks with a total of 24,576 compute nodes, or 393,216 compute cores. Vulcan’s architecture is identical to that of Sequoia, the 20-petaflop machine dedicated to classified national security research.
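
For a sense of scale, the figures quoted above are mutually consistent. A few lines of Python (a back-of-the-envelope sketch using only the numbers in the text) recover the per-node and per-core breakdown:

    # Back-of-the-envelope check of the hardware figures quoted above,
    # using only the numbers given in the text.
    catalyst_nodes, catalyst_cores = 324, 7776
    print("Catalyst cores per node:", catalyst_cores // catalyst_nodes)   # 24

    vulcan_racks, vulcan_nodes, vulcan_cores = 24, 24576, 393216
    vulcan_peak_flops = 5e15                                              # 5 petaflops
    print("Vulcan nodes per rack:", vulcan_nodes // vulcan_racks)         # 1,024
    print("Vulcan cores per node:", vulcan_cores // vulcan_nodes)         # 16
    print("Peak gigaflops per core: %.1f"
          % (vulcan_peak_flops / vulcan_cores / 1e9))                     # about 12.7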

“Our broad, robust ecosystem keeps us at the vanguard of computing,” says Streitz. “With it, we have both a long history and a promising future of turning compute cycles into science and solutions.”

HPCIC Director Frederick Streitz (left) meets with Doug East, the Laboratory’s chief information officer, in front of the Laboratory’s Vulcan supercomputer. This IBM BlueGene/Q system can process 5 quadrillion floating-point operations per second. (Photograph by Laura Schulz.)


Removing the Hurdles

Streitz notes, however, that private industries may hesitate to adopt HPC systems because of the investment required. “Supercomputers are not ‘plug-and-play’ devices,” he says. “To fully exploit a machine’s capabilities, companies need the expertise to design codes that run efficiently on massively parallel systems.” Computational scientists at Livermore have extensive experience working with HPC systems and have developed a rigorous process for validating the accuracy of the simulated results. By collaborating through HPCIC, private-sector businesses have access to the Laboratory’s knowledge, experience, and unclassified supercomputers to pursue new technologies and manufacturing capabilities.
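
As a small illustration of what designing codes for massively parallel systems involves, the sketch below assumes Python with the mpi4py package; it is illustrative only and not one of the Laboratory’s codes. Each MPI rank works independently on its own slice of the data, and a single collective reduction combines the partial results:

    # Minimal data-parallel sketch (illustrative only): each MPI rank sums its
    # slice of a large array, then a collective reduction combines the results.
    # Assumes Python with mpi4py; run with, e.g., "mpirun -n 4 python script.py".
    from mpi4py import MPI
    import numpy as np

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()      # this process's index
    size = comm.Get_size()      # total number of processes

    n = 10_000_000              # total problem size, split evenly across ranks
    chunk = n // size
    local = np.arange(rank * chunk, (rank + 1) * chunk, dtype=np.float64)

    local_sum = local.sum()                          # independent work on each rank
    total = comm.allreduce(local_sum, op=MPI.SUM)    # combine partial sums across ranks

    if rank == 0:
        print("global sum:", total)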

The selection process at HPCIC ensures that projects match Livermore strengths and industry needs. “Projects must also help us advance the Laboratory’s capabilities,” says computer scientist Deborah May, who manages the center’s business development efforts. May connects promising proposals with the Livermore researchers best suited for those projects. Partnerships are then structured under a formal arrangement, such as a cooperative research and development agreement or DOE’s Strategic Partnership Projects, formerly known as Work for Others. Team members then work together for the project’s duration, focusing their expertise toward finding innovative solutions to complex challenges.

Building a Knowledge Pipeline

HPCIC also fosters long-term strategic partnerships in research areas that will provide value to Laboratory programs and corporate entities. One such partnership, called Deep Computing Solutions, expands on Livermore’s 20-plus-year relationship with IBM in developing HPC systems for stockpile stewardship. Through this effort, computational experts from IBM and Livermore work with collaborators from U.S. industries to accelerate the development of new technologies that will benefit both the nation’s security and its economy.

A partnership with RAND Corporation focuses on developing new capabilities in scalable policy analytic methods. “The RAND partnership with Livermore offers the opportunity to pursue new understanding and potential solutions to even the most intractable current and future policy problems,” says Susan Marquis, RAND vice president for Emerging Policy Research and Methods and dean of the Pardee RAND Graduate School.

Says Jim Brase, deputy associate director for big data in Livermore’s Computation Directorate, “Although there are policy elements to the science work we conduct, the Laboratory doesn’t have the extensive experience in policy that RAND does. Coming together helps us gain knowledge and establish new expertise in an area that is relevant to our mission work.”

In the initial project with RAND, researchers revisited an earlier study that evaluated water management strategies for the Colorado River Basin. The joint team used the University of Colorado Boulder’s RiverWare and RAND’s Robust Decision Making analytical framework on a Laboratory system to model the effects of the policy options considered in the original study as well as additional strategies, producing results within hours. “The RAND simulations demonstrate well how large-scale computation can change the nature of the game,” says Streitz. “Developing the ability to explore complex data sets and decision options at a scale previously impossible could revolutionize the way decision makers, policy analysts, and the research community approach some of today’s most challenging issues.”
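
The pattern behind that speedup is, at its core, evaluating many candidate policies side by side. The toy sketch below (Python, illustrative only; it is not RiverWare or RAND’s Robust Decision Making framework) sweeps a hypothetical reservoir water-balance model over a grid of policy choices in parallel:

    # Toy scenario sweep (illustrative only): evaluate a hypothetical reservoir
    # water-balance model over many candidate policies in parallel, the way a
    # large machine can examine options side by side.
    from multiprocessing import Pool
    from itertools import product

    def simulate(policy):
        """Hypothetical model: years of storage under a (cutback, demand_growth) policy."""
        cutback, demand_growth = policy
        storage, inflow, demand = 100.0, 9.0, 10.0 * (1.0 - cutback)
        years = 0
        while storage > 0 and years < 100:
            storage += inflow - demand
            demand *= 1.0 + demand_growth
            years += 1
        return policy, years

    if __name__ == "__main__":
        policies = list(product([0.0, 0.1, 0.2, 0.3],    # delivery cutbacks
                                [0.00, 0.01, 0.02]))     # annual demand growth
        with Pool() as pool:
            for (cutback, growth), years in pool.map(simulate, policies):
                print(f"cutback={cutback:.1f} growth={growth:.2f} -> {years} years")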

Another area of interest for HPCIC involves independent software vendors, which develop and sell the scientific and engineering codes that dominate American industry. “Companies have built entire workflows around these commercial codes,” says Streitz, who also chairs a software working group for the Council on Competitiveness. “Unfortunately, their codes don’t scale to the newer generations of computational platforms available to industry.” HPCIC collaborations with vendors focus on scaling, expanding, and optimizing these codes so they will run on current and future HPC systems. HPCIC provides access to the hardware needed for testing and validating the codes. Collaborators work closely with Laboratory scientists who have the experience required to program at scale, allowing the vendors to modify their codes more quickly.

“Just as our computers are a decade ahead of most of industry, the same can be said of our codes and our programming knowledge,” says Streitz. “We want to help software vendors transition to more powerful computing platforms. If they move forward, industry will follow.”

In an HPCIC strategic partnership, Livermore researchers are working with colleagues from RAND Corporation to explore the use of HPC applications for analyzing options in public policy. At this conference, RAND and Lawrence Livermore held a demonstration for water managers—including the U.S. Department of Interior’s Bureau of Reclamation and the California Department of Water Resources—and others to show how Laboratory HPC could perform a study of the Colorado River Basin in hours instead of the several days required initially, while evaluating additional strategy options at the same time. (Photograph by George Kitrinos.)

Collaborations Across Industries

HPCIC partnerships also offer access to Livermore research centers such as the Turbulence Analysis and Simulation Center (TASC). Managed by the Engineering Directorate, TASC applies advanced numerical methods to simulate and analyze turbulent mixing and reactive flows. These simulation capabilities are important in research to design jet, rocket, and internal combustion engines. Reactive flow models can also be used to predict weather patterns, the atmospheric dispersion of pollutants, and the availability of wind power for electricity generation.

In 2012, engineers from TASC collaborated with General Electric Global Research on simulations to improve the efficiency of jet engines as part of the Laboratory’s HPC4energy initiative. Under HPC4energy, six U.S. energy companies partnered with Livermore computational scientists to demonstrate the potential of advanced computing to provide solutions to energy and environmental challenges. In addition to General Electric Global Research, HPC4energy projects involved GE Energy, Potter Drilling, United Technologies, ISO New England, and Bosch. (See S&TR, June 2012, Incubator Busy Growing Energy Technologies, and S&TR, June 2013, Scaling Up Energy Innovation through Advanced Computing.) Eugene Litinov, senior director of Business Architecture and Technology at ISO New England, noted that access to the Laboratory’s computing capabilities allowed his company to “think differently about problems. You can ask questions you didn’t think of asking before.”

In another energy-related collaboration, Livermore and IBM computational experts helped Energy Exemplar Corporation simulate the nation’s electric power grid. This research led to a massively parallel implementation of Energy Exemplar’s energy market simulation software, increasing the software’s performance a thousandfold. “Our work with Energy Exemplar served as a demonstration of the capability that can be developed by applying HPC to the challenges of our energy system,” says Streitz. “Ensuring the efficiency, reliability, and safety of the energy grid is a national security challenge.”

A partnership with Cymer Corporation is applying codes originally developed for experiments at the National Ignition Facility to help the company develop an extreme ultraviolet (EUV) light source. The Cymer light source will use laser-heated tin droplets to generate EUV light for creating computer circuits on silicon wafers. The collaborators are working to shorten the time required to develop the new light source, which will enable chip manufacturers to increase the number of transistors on a computer chip and thus significantly improve the performance of future supercomputers. 

“Using our codes to simulate the conditions in Cymer’s experiments resulted in a more robust and broadly applicable modeling capability for us,” says Livermore computational physicist Steve Langer, who led the Cymer project. “We also gain a lot of satisfaction from applying our expertise to activities that have a direct impact on everyday life.”

Oil and natural gas service provider Baker Hughes and Livermore geoscientists are teaming to better predict the formation of hydraulic fractures for shale oil and gas production. A new supercomputer code being developed will include the important physical processes that control fractures in rocks. In a related project, the Laboratory is partnering with a company to speed up simulations of fluid flow in oil and gas reservoirs, which will improve business decisions for oil and gas development.

The Cardioid code, developed by a team of IBM and Lawrence Livermore computational scientists, replicates the electrophysiology of the human heart. This snapshot from a Cardioid calculation shows a heart immediately following a heartbeat, during which electrical excitation travels through the heart’s cells. The recovery of cells to their resting voltage (blue) from this excited state (red) varies from region to region. (Image courtesy of IBM.)

Biomedical Payoffs

Biomedical research also benefits from the incorporation of HPC into the discovery process. Under the Laboratory’s Medical Countermeasures Program, Livermore scientists and colleagues from a pharmaceutical company ran HPC codes to accelerate development work on a new antibiotic to treat infections. Simulations helped researchers screen new compounds, create detailed maps of target proteins that drugs will bind to, and predict the physicochemical properties of the compounds as well as potential side effects. This collaboration resulted in a patent application for a broad-spectrum antibiotic that was developed in only three months. Modeling has also proven effective for designing new drug delivery platforms and developing vaccines and antimicrobials.
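
As a rough illustration of the property-screening step described above, the sketch below assumes the open-source RDKit toolkit and a simple Lipinski-style filter; it is not the collaboration’s actual pipeline, and the compounds are hypothetical examples:

    # Illustrative compound screen (not the collaboration's pipeline), assuming RDKit:
    # compute basic physicochemical descriptors and keep candidates that pass a
    # simple Lipinski-style drug-likeness filter.
    from rdkit import Chem
    from rdkit.Chem import Descriptors

    candidates = {                     # hypothetical candidate compounds as SMILES strings
        "aspirin-like": "CC(=O)Oc1ccccc1C(=O)O",
        "caffeine-like": "Cn1cnc2c1c(=O)n(C)c(=O)n2C",
    }

    for name, smiles in candidates.items():
        mol = Chem.MolFromSmiles(smiles)
        mw = Descriptors.MolWt(mol)            # molecular weight
        logp = Descriptors.MolLogP(mol)        # lipophilicity estimate
        hbd = Descriptors.NumHDonors(mol)      # hydrogen-bond donors
        hba = Descriptors.NumHAcceptors(mol)   # hydrogen-bond acceptors
        passes = mw < 500 and logp < 5 and hbd <= 5 and hba <= 10
        print(name, round(mw, 1), round(logp, 2), hbd, hba,
              "PASS" if passes else "FAIL")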

A biomedical effort between IBM and LLNL produced Cardioid, the most realistic simulation of a beating human heart ever developed. (See S&TR, September 2012, Venturing into the Heart of High-Performance Computing Simulations.) The supercomputer code driving Cardioid models the heart’s electrical system, the current that originates from and travels through the heart and causes it to beat and pump blood. This breakthrough simulation capability holds promise in helping researchers better understand how the heart works and responds to different medicines—important information for discovering new drugs and patient-specific therapies to treat cardiovascular disease and improve heart health.
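
Cardioid itself solves detailed models of the heart’s electrical activity. A much simpler, classic stand-in for the excitation-and-recovery cycle described above is the FitzHugh–Nagumo model, sketched here for a single cell (an illustration only, not Cardioid’s equations):

    # Single-cell excitation-and-recovery toy (FitzHugh-Nagumo model), shown only
    # as a simple stand-in for the far more detailed electrophysiology Cardioid solves.
    import numpy as np

    a, b, eps, stim = 0.7, 0.8, 0.08, 0.5   # classic FitzHugh-Nagumo parameters
    dt, steps = 0.1, 2000
    v, w = -1.0, 1.0                        # membrane voltage and recovery variable

    trace = np.empty(steps)
    for t in range(steps):
        dv = v - v**3 / 3.0 - w + stim      # fast excitation of the "cell"
        dw = eps * (v + a - b * w)          # slow recovery back toward rest
        v, w = v + dt * dv, w + dt * dw
        trace[t] = v

    print("voltage range over the run: %.2f to %.2f" % (trace.min(), trace.max()))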

Livermore’s Catalyst supercomputer, designed for data-intensive computing, is available for research collaborations with U.S. industries and academic institutions. (Photograph by Laura Schulz.)

Data-Intensive Computing

An important new area for partnerships focuses on data-intensive computing, in which data ranging from terabytes to petabytes (10¹²–10¹⁵ bytes) in volume are processed. “We believe that advancing ‘big data’ technology is a key to accelerating innovation,” says Brase. “Over the next decade, global data volume is likely to grow to more than 35 zettabytes—that’s 35 trillion gigabytes. We want to figure out how to extract value from this wealth of raw information so we can better inform decision makers.”

HPCIC is expanding the number of partnerships involving data-intensive science and analytics, with simulations performed on Livermore’s Catalyst supercomputer. With Catalyst, researchers can store enormous reference databases in memory and run expansive analyses with more detail and resolution than previously available. Example projects include analyzing the genomes of disease-causing microbes, extracting predictive health indicators from a hospital database, and testing complex models to improve video searches.
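
A small sketch of the “reference database held in memory” idea: a toy k-mer index for matching a sequencing read against hypothetical microbial reference genomes (illustrative only; the Laboratory’s actual analytics are far larger and more sophisticated):

    # Toy in-memory reference index (illustrative only): map short DNA substrings
    # (k-mers) from hypothetical reference microbial genomes to their sources,
    # then look up a query read entirely in memory, the access pattern that
    # Catalyst's large node memory is designed to support.
    from collections import defaultdict

    K = 8
    references = {                     # hypothetical reference sequences
        "microbe_A": "ACGTACGTGGCCTTAACGTACGT",
        "microbe_B": "TTGGCCAACGTACGTACGTTAGC",
    }

    index = defaultdict(set)
    for name, seq in references.items():
        for i in range(len(seq) - K + 1):
            index[seq[i:i + K]].add(name)      # k-mer -> genomes containing it

    query = "CGTACGTGGCCT"                     # a short query read
    hits = defaultdict(int)
    for i in range(len(query) - K + 1):
        for name in index.get(query[i:i + K], ()):
            hits[name] += 1

    print("best match:", max(hits, key=hits.get) if hits else "none")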

Looking Ahead

According to Cantwell, LVOC represents a “sea change” for Lawrence Livermore. “HPCIC has expanded the breadth of collaboration opportunities for Lawrence Livermore while providing tangible benefits to companies.”

She adds that the Laboratory is benefiting from industry as well, often in areas that are central to national security, such as cybersecurity, energy generation, advanced manufacturing, and bioscience. “Collaborations with Bay Area industries help us stay abreast of developments in these rapidly advancing fields.”

Camille Bibeau, assistant to the Laboratory’s director of economic development, is exploring ways to revitalize and expand Lawrence Livermore’s open-campus infrastructure. For example, she is working with the University of California (UC) to renovate Hertz Hall—formerly home to the Department of Applied Science at UC Davis—to include UC-wide programs in areas of mutual interest.

Bibeau is also exploring potential third-party financial arrangements to build a permanent home for HPCIC, which is currently housed in a temporary modular building. DOE approval is pending on a proposed 100,000-square-foot facility targeted for completion in 2019. The new facility will be eight times larger than the temporary facility and accommodate up to 400 tenants. As HPC collaborations continue to grow, scientists and engineers from industry and academia will benefit from working side-by-side with Livermore staff.

“Challenges include presenting low-risk business models that DOE will accept for building new infrastructure on campus,” Bibeau says. Under one proposed arrangement, DOE would lease land to a third party, which would build the facility. The Laboratory would then lease the facility and over time provide space to its strategic partners. With new facilities in operation, Bibeau foresees LVOC hosting international conferences and workshops and expanding educational programs for local communities and for the nation.

Some of the strongest support for LVOC comes from businesses and governments in the Tri-Valley area, which includes the cities of Livermore, Pleasanton, Dublin, San Ramon, and Danville. In that vein, Bibeau is working closely with the Tri-Valley’s startup incubator i-GATE, one of twelve Innovation Hubs (or i-Hubs) designated by the State of California to foster the growth of technology-oriented companies. Bibeau helped found i-GATE, which is funded by Tri-Valley governments and private donations.

“Interest in collaboration opportunities with Lawrence Livermore is increasing,” says Bibeau. With LVOC and HPCIC, the Laboratory is developing an innovation pipeline that will contribute to national security and U.S. economic competitiveness.

—Arnie Heller

Key Words: data-intensive computing, Cardioid code, Catalyst supercomputer, high-performance computing (HPC), High Performance Computing Innovation Center (HPCIC), Livermore Valley Open Campus (LVOC), Vulcan supercomputer.

For further information contact Frederick Streitz (925) 423-3236 (streitz1@llnl.gov).