Focus on better climate models
DOI: 10.1063/PT.4.0946
Three federal agencies have announced the formation of a collaborative research program aimed at developing more powerful computer models capable of predicting the regional impacts of global warming. The new grant program—to be funded at $50 million annually for five years by NSF, the Department of Energy (DOE), and the US Department of Agriculture (USDA)—also aspires to create models detailed enough to allow predictions on a decadal basis, instead of the century-long scale that is typical of today’s simulations.
The program, Decadal and Regional Climate Prediction Using Earth System Models (EaSM), was announced by officials from the three agencies, including NSF director Arden Bement; William Brinkman, director of DOE’s Office of Science; and Roger Beachy, director of USDA’s National Institute of Food and Agriculture.
Two types of interdisciplinary proposals will be considered for EaSM funding. Capacity- and community-building activities that address one or more of the program’s goals and last up to three years are eligible for awards of up to $300,000 annually. A second type of proposal should describe large collaborative and interdisciplinary efforts, lasting three to five years, that advance Earth system modeling on regional and decadal scales; these proposals may receive $300,000 to $1 million in annual funding.
“If there ever is an issue that’s going to be hugely impacted if we have to do something about climate, it is energy,” Brinkman declared. “The knowledge base we’re trying to create [with EaSM] will play an absolutely essential role in understanding what we’re going to have to do in the future to remediate this situation.”
Brinkman and Bement stressed that EaSM is only one small component of their respective agencies’ programs in climate modeling. DOE, said Brinkman, has “worked hard” to get to petaflop-level computation and expects to move soon to the 10–20 petaflop level. “We would love to be able to move to the exascale level—another 1000 times faster—but there are major challenges to doing that. We believe that climate modeling is probably the driving force to continue in that direction, more than any other modeling that we can think of.”
But Bement cautioned that “it’s not just the heavy metal on the floor; it’s how you use the computing capability.” More sophisticated visualization equipment and mathematical algorithms are needed to interpret the models, while improved application software is required in order to operate at higher capacities. “It also takes education of the science community in how to use these computational tools,” he said. “One has to look at this as a total cyberinfrastructure problem, to include how to deal with the massive amounts of data, in terms of retrieving it and synthesizing it, as well as archiving it for future use.”
David Kramer