On the newest episode of the Big Ideas Lab podcast, listeners will go behind the scenes of Lawrence Livermore National Laboratory's (LLNL) latest groundbreaking achievement: El Capitan, the world's most powerful supercomputer. Listen on Apple or Spotify.
Built by Hewlett Packard Enterprise (HPE) with AMD's new Instinct MI300A Accelerated Processing Units (APUs), El Capitan is designed to operate on an unprecedented scale for the National Nuclear Security Administration's Tri-Labs (LLNL, and Los Alamos and Sandia National Laboratories). Capable of performing more than 2 quintillion calculations per second, El Capitan is a behemoth more than 20 times faster than LLNL's previous most powerful supercomputer, Sierra, putting it squarely at the forefront of modern supercomputing.
In the episode, El Capitan experts, including LLNL's Associate Director for Weapon Simulation and Computing Rob Neely, Chief Technology Officer for Livermore Computing Bronis de Supinski, and Associate Program Director for Computational Physics in Weapon Simulation and Computing Teresa Bailey, explain the significance of the machine, and of exascale computing, to the nuclear security enterprise and beyond.
"The sheer number of calculations that you can perform in a fixed amount of time is beyond anything that we've been able to do in the past," Neely said.
A significant focus of the episode is how El Capitan is central to LLNL's mission to maintain the safety and reliability of the U.S. nuclear stockpile. This shift to exascale will enable scientists to perform simulations for the NNSA's Stockpile Stewardship Program with a precision that was previously impossible, which will play a vital role in U.S. national security.
"Prior to 1992, we would go off to Nevada, drill a big hole in the ground, put the weapon down there and set it off," Neely said. But with the end of such tests, NNSA had to innovate. "That really spearheaded a big push in the United States to use supercomputing as one leg of a new tool called Science-Based Stockpile Stewardship, designed to make sure we could retain our confidence in these weapons."
El Capitan's computational power allows LLNL to conduct virtual tests with an unprecedented level of detail and accuracy, ensuring the U.S. can maintain a reliable nuclear deterrent without actual testing.
El Capitan's design and development required a unique blend of government and industry collaboration. Bailey explains that El Capitan embodies a vision that dates back more than 25 years to the Accelerated Strategic Computing Initiative (ASCI), which aimed to create supercomputers for maintaining the nuclear stockpile without underground tests.
"The ASCI program was designed to deliver modeling and simulation tools aimed at stockpile stewardship, using high-performance computers, so that we would never have to go back to nuclear testing. El Capitan really represents that end product for the original vision of ASCI," Bailey said.
The immense power of El Capitan extends beyond national security applications. Advances and knowledge gained by using the machine will fuel broader scientific breakthroughs, making it a crucial resource not just for national security but also for understanding and addressing pressing global challenges.
"There are problems I can imagine. They're big problems, they're things that no one has ever dreamed of really trying," Neely said. "They're probably not going to be the first thing we try, but over the life of the machine, we will take a shot. I'm very certain of that."
The episode concludes with a look to the future. Neely, de Supinski and Bailey view El Capitan as the first step in a new era of computational capability and as evidence of the nation's continuing leadership in supercomputing, with future systems already in the planning stages.
To hear more about how El Capitan is shaping the future of national security and scientific research, listen on Apple or Spotify.