NNSA touts savings from supercomputing
DOI: 10.1063/PT.3.2176
Recent advances in computational capabilities are cutting as much as $2 billion off the cost of refurbishing and maintaining the US nuclear weapons stockpile, say officials of the National Nuclear Security Administration (NNSA). Now the weapons program has set its sights on exascale computing—three orders of magnitude faster than the current state-of-the-art petascale computers—by around 2020.
Increasingly powerful simulations have contributed to decisions in recent years to use recycled plutonium pits instead of newly manufactured ones in the life-extension programs for several aging weapons systems. Those decisions in turn led the NNSA to postpone construction of a new pit production plant at Los Alamos National Laboratory for at least five years. And simulations recently helped officials determine that a “very expensive” proposed warhead modification wasn’t actually needed to correct a problem in one weapons system, says Donald Cook, the NNSA’s deputy administrator for defense programs.
Exascale computing will allow modelers to simulate with a much finer degree of granularity what happens with weapons components, Cook says. “If you look at the granularity of metals, plastics, and polymers, the limitation on computing now is that … you have to make the assumption that the material behaves kind of uniformly down below a certain size level.” Current simulations can’t account for grain boundaries in metals, he notes. “Yet if you look at where cracks develop in metals, they always develop at the grain boundary. If you look at where corrosion occurs, it’s at a grain boundary. If you look at the effect on materials of aging, you gather a lot of chemical contaminants at the grain boundary.”
Billion-dollar impacts
Dimitri Kusnezov, a senior adviser to Energy secretary Ernest Moniz, told the Secretary of Energy Advisory Board on 13 September that simulations performed this year had helped determine that the lifetime of a particular weapons system is a decade longer than it was believed to be just two years ago. Kusnezov said that the simulations to produce the finding were not available two years ago and that the finding would save the NNSA $2 billion. “Today we make decisions with [billion-dollar] impacts that we could not do just 2–3 years ago,” he stated in his presentation. Cook says the savings achieved with simulations have already been incorporated in the NNSA’s budget request and outyear projections. “It’s not like there was a discovery and all of a sudden we knocked $2 billion off the cost of everything,” Cook says.
Pit reuse has been approved by the NNSA for the refurbished W76, a warhead carried by Trident missiles; for the B61 bomb; and for the proposed integrated replacement for the W78 and W88 warheads carried by Minuteman land-based and Trident missiles. Further savings would result if the NNSA decides, as is likely, to reuse pits in the life extension of W80 warheads carried by air-launched cruise missiles, Cook says.
Based on simulations and experimental results, researchers at Lawrence Livermore and Los Alamos National Laboratories reported last year that pits should function as designed for at least 150 years. That’s up from the 85-year minimum pit life the two labs had estimated in 2006.
Sequoia, an IBM Blue Gene/Q system at Lawrence Livermore National Laboratory, was ranked third in the world in computing performance in June by the widely recognized Top 500 supercomputer sites listing. (Credit: LLNL)
Keeping cost growth down
But even with pit reuse, the estimated cost of the B61 life extension has soared to $10 billion, according to a Department of Defense assessment. That’s up from the $4.5 billion estimated just two years ago, before the NNSA and the DOD decided to add more features to the refurbished bomb. Senate appropriators have balked at the increase; their version of a fiscal year 2014 funding bill would cut the NNSA’s request for the B61 program. Although House appropriators added to the B61 request, they, too, expressed concern over its high cost and ordered the NNSA to better justify the program. At press time, FY 2014 appropriations were unresolved.
Still, modeling helps prevent further cost growth on the B61 program, Cook says. He points to two tests, carried out by engineers at Sandia National Laboratories in August, of the pulsed Doppler radar developed to replace the bomb’s existing vacuum-tube version. Neither the full set of data obtained in each test nor Sandia’s conclusion that further tests are not required would have been possible without modeling and simulation, he says.
“Simulation is a great boon, but without experiments, you wouldn’t know what to trust in the codes,” says Cook. “So it’s a combination where we’re using simulations to drive the kind of experiments that we’re doing in order to benchmark the simulation.”
Within the limits of reasonable power consumption, which the NNSA defines as 20 MW or less, there is no obvious path to exascale computing, Kusnezov told the Secretary of Energy Advisory Board. A 1-teraflop Intel microprocessor the size of a pack of gum—equal in processing speed to the world’s most powerful supercomputer in 1996—is expected to come on the market within two years, he said. But an exascale machine would require 1 million of them. New programming paradigms and new methods for dealing with massive data sets up to the yottabyte (10²⁴-byte, or trillion-terabyte) scale, said Kusnezov, are among the challenges that will need to be overcome.
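Those figures follow from simple but stark arithmetic; the back-of-the-envelope check below is ours, not part of Kusnezov’s presentation, and uses only the numbers quoted above:
\[
  \frac{1~\text{exaflop}}{1~\text{teraflop}}
  = \frac{10^{18}~\text{flops}}{10^{12}~\text{flops}}
  = 10^{6}~\text{chips},
  \qquad
  \frac{10^{18}~\text{flops}}{20~\text{MW}}
  = \frac{10^{18}~\text{flops}}{2\times 10^{7}~\text{W}}
  = 50~\text{gigaflops per watt}.
\]
In other words, an exaflop is a million times a teraflop, and staying under the NNSA’s 20-MW cap would require roughly 50 gigaflops of computation per watt of power consumed.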