As of early on 4 August—shortly after the Obama administration issued new environmental regulations bearing on climate and an executive order to elevate the US to supercomputing preeminence—the search term “Obama emissions” yielded some 9,140,000 results at Google News, but “Obama supercomputer” yielded two orders of magnitude fewer.
The Wall Street Journal called the Obama Environmental Protection Agency’s (EPA’s) new regulations “the first-ever federal limits on power-plant carbon emissions,” aiming “to change the way Americans make and consume electricity, accelerating a shift already under way toward cleaner fuels, renewable energy and consumer-generated power.” Another WSJ piece predicted the obvious: a political “firestorm.” And indeed a WSJ editorial quickly appeared under the headline “Climate-change putsch: States should refuse to comply with Obama’s lawless power rule.”
But in the quieter realm of news and comment about the supercomputing executive order—mainly though not only in technology-focused publications—a Washington Post piece began, “President Obama has launched an ambitious technology initiative, a moonshot in the world of supercomputing that could help solve some of the world’s most complex problems.”
The executive order “Creating a National Strategic Computing Initiative” (NSCI) calls for “a whole-of-government effort ... to create a cohesive, multi-agency strategic vision and federal investment strategy, executed in collaboration with industry and academia, to maximize the benefits” of high-performance computing (HPC) for the US. An accompanying press release explained:
HPC has historically focused on using numerical techniques to simulate a variety of complex natural and technological systems, such as galaxies, weather and climate, molecular interactions, electric power grids, and aircraft in flight. The largest of these machines are referred to as supercomputers. One measure of supercomputer performance is flops, or floating-point operations per second, indicating the number of arithmetic operations performed each second. Over the next decade the goal is to build supercomputers capable of one exaflop (10¹⁸ operations per second). It is also important to note that HPC in this context is not just about the speed of the computing device itself. As the President’s Council of Advisors on Science and Technology has concluded, high-performance computing “must now assume a broader meaning, encompassing not only flops, but also the ability, for example, to efficiently manipulate vast and rapidly increasing quantities of both numerical and non-numerical data.”
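To give those numbers some texture, here is a rough back-of-the-envelope sketch, not drawn from the press release; the machine and performance figures are assumptions based on publicly reported Top500 data from around 2015.

```python
# Rough illustration of the scale implied by "one exaflop."
# Figures are approximate, publicly reported values (assumptions,
# not taken from the executive order or its press release).

PETAFLOP = 1e15  # floating-point operations per second
EXAFLOP = 1e18

# Tianhe-2, the fastest machine on the mid-2015 Top500 list,
# sustained roughly 33.9 petaflops on the Linpack benchmark.
tianhe2_flops = 33.9 * PETAFLOP

speedup_needed = EXAFLOP / tianhe2_flops
print(f"An exaflop machine would be roughly {speedup_needed:.0f}x faster than Tianhe-2")

# At one exaflop, a computation requiring 10**21 arithmetic operations
# would finish in about a quarter of an hour.
ops = 1e21
print(f"{ops:.0e} operations at 1 exaflop take about {ops / EXAFLOP / 60:.0f} minutes")
```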
The exascale initiative actually does bear importantly on the issues addressed by the new EPA regulations, as the release’s mention of climate suggested, and as did the opening lines of both the summary and the text of Science magazine’s 2012 news article “What it’ll take to go exascale.” But the press release, from a White House that well knows the flammability of climate technopolitics, didn’t highlight climate. Instead it emphasized the implications of exascale computing for computational fluid dynamics in aeronautics and for precision medicine. Along that medical line, BBC quoted Richard Kenway of the University of Edinburgh concerning exascale and the development of personalized medicines tailored to individuals: “Today, drugs are designed for the average human and they work OK for some people but not others. The real challenge in precision medicine is to move from designing average drugs to designing drugs for the individual because you can know their genome and their lifestyle.”
Much of the exascale coverage takes an optimistic view, including about the often assumed, yet increasingly questionable, future of Moore’s law. But as Computerworld pointed out, almost quoting verbatim the executive order itself, “One of the objectives of the NSCI will be to provide a viable path over the next 15 years, even after the limits of current semiconductor technology are reached in the ‘post-Moore’s law era.’”
And then there’s the energy-consumption issue. The Washington Post quoted J. Steve Binkley, associate director of the Department of Energy’s Office of Advanced Scientific Computing Research: “If you scale current technology up to exascale levels, it would be up to the range of a nuclear power plant just to run one computer.” The Post also reported that “Binkley believes the initiative can hit the exascale target in roughly a decade, but that depends on funding and overcoming some technical challenges.”
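A quick, hedged calculation suggests why Binkley’s warning is plausible; the machine and power figures below are assumptions drawn from public Top500 reports circa 2015, not from the Post story.

```python
# Back-of-the-envelope check on the exascale power problem (assumed figures).
# Tianhe-2 (circa 2015): about 33.9 petaflops sustained at roughly 17.8 MW.
tianhe2_flops = 33.9e15   # floating-point operations per second
tianhe2_power_mw = 17.8   # megawatts

exaflop = 1e18

# Naive linear scaling of 2015-era technology up to one exaflop:
scaled_power_mw = tianhe2_power_mw * (exaflop / tianhe2_flops)
print(f"Linear scaling implies roughly {scaled_power_mw:.0f} MW")  # ~525 MW

# For comparison, a typical commercial nuclear reactor delivers on the order
# of 1000 MW of electricity, so the naive estimate is indeed in the
# "range of a nuclear power plant."
```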
The UK technology website the Inquirer emphasized that the “US exaflop supercomputer will be too inefficient to change the world.” The site added that the “UK supercomputer and optical computing firm Optalysys believes that, if Obama is to have a chance of achieving his target, he needs to take a novel approach, as his proposed scheme is ‘massively inefficient.’” The UK information-technology site IT Pro asked, “Will Obama’s supercomputer be obsolete by 2025?”
So when will we have an exascale supercomputer? That question headlined an IEEE Spectrum article that appeared more than seven months before the executive order. The subhead answered, “2023 if we do it right; tomorrow if we do it crazy.” Development engineers will “need new computer architectures capable of combining tens of thousands of CPUs and graphics-processor-based accelerators,” the article predicted. They will need “to deal with the growing energy costs required to move data from a supercomputer’s memory to the processors,” while software developers “will have to learn how to build programs that can make use of the new architecture.” The article also quoted Steve Scott, senior vice president and chief technology officer at Cray: “You could build an exaflop computer tomorrow, but it’d be a crazy thing to do because of the cost and energy required to run it.”
---
Steven T. Corneliussen, a media analyst for the American Institute of Physics, monitors three national newspapers, the weeklies Nature and Science, and occasionally other publications. He has published op-eds in the Washington Post and other newspapers, has written for NASA’s history program, and is a science writer at a particle-accelerator laboratory.