HPC, Simulation, and Data Science

High-performance computing (HPC) has always been a defining strength of the Laboratory. State-of-the-art simulation applications that run efficiently on the world’s most advanced computers are the integrating element of science-based stockpile stewardship and are critical to many other national security needs. These extraordinarily realistic and reliable science and engineering simulations allow modeling and simulation to assume a role equal to that of experiment and theory.

Livermore’s flagship supercomputer is Sierra, a next-generation system focused on predictive applications to sustain the nuclear deterrent. Concurrently, Livermore hardware and software computer scientists are helping to prepare for El Capitan, an exascale computing system that will be capable of at least two quintillion (or two million trillion) calculations per second. We’re also developing new computer architectures as well as vertically integrating hardware and software, multiphysics applications, and data science analytics so they run seamlessly at exascale.
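A minimal arithmetic sketch of the scale quoted above: "exascale" denotes on the order of 10^18 floating-point operations per second, so "two quintillion" and "two million trillion" name the same quantity. The desktop comparison below is an illustrative assumption (a nominal 1-gigaFLOPS core), not a figure from the source.

```python
# "Exa" is the SI prefix for 10^18.
exa = 10**18

# El Capitan's stated target: at least two quintillion calculations per second.
el_capitan_flops = 2 * exa

# "Two million trillion" spells out the same number:
# 2 * (10^6 million) * (10^12 trillion) = 2 * 10^18.
assert el_capitan_flops == 2 * 10**6 * 10**12

# For scale (hypothetical baseline): one second of El Capitan time equals
# roughly how many years of work for a nominal 1-gigaFLOPS desktop core?
seconds_per_year = 365.25 * 24 * 3600
desktop_core_years = el_capitan_flops / (10**9 * seconds_per_year)
print(f"~{desktop_core_years:.0f} desktop-core-years per El Capitan second")
```

Running this prints a figure of roughly 63 desktop-core-years for every second of exascale computation, which conveys why "seamless at exascale" is a nontrivial software goal.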

In data science, we’re creating the capabilities to recognize patterns in massive amounts of information (called big data) in order to understand and predict the behavior of complex systems. We’re applying data science in innovative ways, such as discovering new relationships between human genetic makeup, cancer susceptibility, and treatment outcomes.
