HPC, Simulation, and Data Science

High-performance computing (HPC) has always been a defining strength of the Laboratory. State-of-the-art simulation applications that run efficiently on the world's most advanced computers are the integrating element of science-based stockpile stewardship and are critical to many other national security needs. These highly realistic and reliable science and engineering simulations allow modeling and simulation to assume a role equal to that of experiment and theory.

Livermore is currently siting Sierra, a next-generation supercomputer focused on predictive applications that sustain the nuclear deterrent. Concurrently, Livermore hardware and software computer scientists are helping to prepare for the coming age of exascale computing (systems capable of at least a billion billion, or 10^18, calculations per second). They are developing new computer architectures and vertically integrating hardware and software, multi-physics applications, and data-science analytics so that applications run seamlessly at exascale.
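For a rough sense of that scale, the sketch below is a hypothetical back-of-the-envelope calculation in Python (not Laboratory code; the 100-gigaflop laptop figure is an assumed value) showing what a single second of exascale computing represents in conventional terms.

    EXAFLOPS = 1e18        # exascale: 10**18 calculations per second
    LAPTOP_FLOPS = 100e9   # assumption: a ~100-gigaflop consumer laptop

    # Seconds a laptop would need to match one second of exascale work
    laptop_seconds = EXAFLOPS / LAPTOP_FLOPS
    laptop_months = laptop_seconds / (3600 * 24 * 30)

    print(f"One exascale-second equals {laptop_seconds:.0e} laptop-seconds")
    print(f"(roughly {laptop_months:.1f} months of nonstop laptop computing)")

Under these assumed numbers, one second of exascale computation corresponds to nearly four months of nonstop work on a typical laptop.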

In data science, we’re creating the capabilities to recognize patterns in extremely large volumes of information (big data) in order to understand and predict the behavior of complex systems. We’re applying data science in innovative ways, such as discovering new relationships between a person’s genetic makeup, cancer susceptibility, and treatment outcomes.
