DOE acquiring new supercomputers and climate models
DOI: 10.1063/PT.3.2545
The US Department of Energy has awarded a three-year, $54 million grant to a 14-member consortium of national laboratories, universities, and other entities to develop climate models for high-performance computers that don’t yet exist. The project is designed to accelerate the development of Earth system models that will improve projections of three specific components of climate change: water-cycle changes, biogeochemical feedbacks, and the collapse of Antarctic ice sheets.
The consortium, called Accelerated Climate Modeling for Energy (ACME), will develop climate software to be run on next-generation supercomputers that are scheduled to be installed at Oak Ridge and Argonne National Laboratories in 2017. Contracts for development of the machines are due to be awarded within the next few months. Since the machines' computing architectures haven't been determined, the ACME group will need to develop flexible and adaptable code, says David Bader, the principal investigator and a Lawrence Livermore National Laboratory (LLNL) atmospheric scientist.
The models to be developed by ACME could not be operated at the required throughput level on DOE’s most powerful civilian computers—Oak Ridge’s 27-petaflops Titan, built by Cray, and Argonne’s 10-petaflops Mira, made by IBM—says Bader (a petaflops is a thousand trillion floating point operations per second). Instead, the software will be designed to run on the three new machines, each of which will have a peak performance of at least 100 petaflops. “The idea is that the models will be ready when the machines are,” he says. The world’s leading supercomputer, with a top speed of 55 petaflops, is China’s Tianhe-2.
The Titan computer at Oak Ridge National Laboratory is widely recognized as the second most powerful in the world. The US Department of Energy announced a new effort to develop climate models that can be run on even higher-performance computers due to be installed at ORNL and Argonne National Laboratory in 2017. (Photo: Oak Ridge National Laboratory)
To minimize the risk of technological failure and to cover a broad set of applications, different computing architectures are being procured for the DOE Office of Science’s leadership computing facilities at ANL and ORNL. DOE’s National Nuclear Security Administration will select one of those designs for a third machine to be installed at LLNL; that computer will be devoted mainly to nuclear weapons work.
Later versions of the ACME models will take full advantage of exascale (1000-petaflops) computers when they become available. Although the architectures of exascale machines may be decided within six years, Bader says, having a working machine that soon is unlikely. Acting Office of Science director Patricia Dehmer testified before Congress earlier this year that acquiring, by the early 2020s, an exascale computer with up to 1000 times the computational power of the current top performers is her office's "highest priority." Two to three generations of computers will be acquired in the intervening years, she added.
The ACME project will involve about 150 individuals. Other participating DOE laboratories are Brookhaven, Lawrence Berkeley, Los Alamos, Pacific Northwest, and Sandia. Additional partners are the National Center for Atmospheric Research (NCAR), Scripps Institution of Oceanography, the University of Maryland, New York University, Kitware Inc., and the University of California, Irvine. The program is expected to run for 10 years, with DOE program reviews occurring at six-month intervals, Bader says.
The ACME program streamlines and reduces overlap among smaller groups at DOE that were developing different components or specific capabilities of climate models to investigate such topics as climate feedbacks in the Arctic, drought, permafrost thaw, and other potential abrupt climate changes, says Dorothy Koch, Earth system modeling program manager in the Office of Science’s Biological and Environmental Research office. Each of ACME’s three science objectives has elements that will explicitly address DOE mission requirements, she notes.
The narrowly focused ACME model builds on the more general Community Earth System Model (CESM), which NCAR is developing with DOE and NSF sponsorship, says Koch. The CESM, which is being scaled to run on the current ORNL and ANL machines, consists of coupled simulations of the atmosphere, oceans, ice, and land that run concurrently. “With ACME being tightly coordinated with CESM and as part of the CESM ‘family of models,’ there is no duplication of effort,” she says.
According to the project plan, the water-cycle portion of the code will simulate changes in the hydrological cycle and will have a specific focus on precipitation and surface water in mountainous regions such as the western US and the headwaters of the Amazon River. For biogeochemistry, researchers will examine how more complete treatments of nutrient cycles affect carbon–climate system feedbacks, especially in tropical systems.
The ACME team will examine the near-term risk for the onset of Antarctic ice-sheet collapse caused by adjacent warming waters. In May scientists reported that the melting of a linchpin glacier will, in as little as two centuries, inevitably cause the West Antarctic Ice Sheet to collapse into the sea, which will in turn raise sea levels by more than three meters (see Physics Today, July 2014, page 10).
More about the Authors
David Kramer. dkramer@aip.org