
Focus on better climate models

MAR 26, 2010

Three federal agencies have announced the formation of a collaborative research program aimed at developing more powerful computer models capable of predicting the regional impacts of global warming. The new grant program, to be funded at roughly $50 million annually for five years by NSF, the Department of Energy (DOE), and the US Department of Agriculture (USDA), also aspires to create models detailed enough to allow predictions on a decadal basis, instead of the century-long scale typical of today’s simulations.

The Decadal and Regional Climate Prediction Using Earth System Models (EaSM) program hopes to produce high-resolution models to guide decision makers in addressing the impacts of a warming climate. While NSF (which is contributing $30 million per year) will manage the initial review of proposals to the joint solicitation, DOE (which has committed $10 million per year) and USDA (which will provide $9 million per year) will select which of the reviewed proposals they will fund. Arden Bement, NSF’s outgoing director, told reporters that his agency is particularly interested in developing models that take into account the influences of living systems and project how those systems will respond and adapt to climate change. “People live in regions, not on the global median. In order for decisionmakers to plan for change over the next 10 to 20 years, we must be able to predict how climate change will impact their regions over the next 10 to 20 years,” Bement said.

William Brinkman, director of DOE’s Office of Science, said the agency “has been on the vanguard of climate modeling” and has “added enormously” to its modeling and simulation capabilities in recent years. Oak Ridge National Laboratory boasts the fastest computer in the world, according to the TOP500 ranking of supercomputers, and climate research facilities at DOE labs have been upgraded with Recovery Act funds. Using computers such as ORNL’s Jaguar, which can achieve petaflop speeds, climate modelers could finally make headway in understanding the roles that aerosols and clouds play in the climate change equation, Brinkman said. “But we also need to improve our understanding so we can put the proper things into the models as a function of time. It’s important to have both these goals in mind,” he added.

Roger Beachy, director of USDA’s National Institute of Food and Agriculture, said USDA wants to develop climate models that can be linked to crop, forestry, and livestock models. Such models would be used to help assess risk management strategies and to project yields at various spatial and temporal scales.

Two types of interdisciplinary proposals will be considered for EaSM funding. Capacity- and community-building activities that address one or more of the program’s goals and last up to three years are eligible for awards of up to $300,000 annually. A second type of proposal should describe large collaborative and interdisciplinary efforts that advance Earth system modeling on regional and decadal scales and last three to five years; these proposals may receive $300,000 to $1 million in annual funding.

“If there ever is an issue that’s going to be hugely impacted if we have to do something about climate, it is energy,” Brinkman declared. “The knowledge base we’re trying to create [with EaSM] will play an absolutely essential role in understanding what we’re going to have to do in the future to remediate this situation.”

Brinkman and Bement stressed that EaSM is only one small component of their respective agencies’ programs in climate modeling. DOE, said Brinkman, has “worked hard” to get to petaflop-level computation and expects to move soon to the 10-20 petaflop level. “We would love to be able to move to the exascale level, another 1000 times faster, but there are major challenges to doing that. We believe that climate modeling is probably the driving force to continue in that direction, more than any other modeling that we can think of.”

But Bement cautioned that “it’s not just the heavy metal on the floor; it’s how you use the computing capability.” More sophisticated visualization equipment and mathematical algorithms are needed to interpret the models, and improved application software is required to operate at higher capacities. “It also takes education of the science community in how to use these computational tools,” he said. “One has to look at this as a total cyberinfrastructure problem, to include how to deal with the massive amounts of data, in terms of retrieving it and synthesizing it, as well as archiving it for future use.”

David Kramer, dkramer@aip.org

