24 000 years of climate change, mapped
DOI: 10.1063/PT.3.4914
Earth’s changing climate is typically summarized with a single number: the average surface temperature. That’s the quantity that has already risen by 1.1 °C since preindustrial times—and that parties to the Paris climate agreement hope to limit to a maximum rise of 2 °C.
But the average surface temperature doesn’t describe how any specific location experiences climate change. Some regions, such as the Arctic, are warming much faster than average, while other regions warm more slowly. Temperatures change differently in different places, and they always have.
Now Matthew Osman and Jessica Tierney (both at the University of Arizona) and their colleagues have reconstructed Earth’s spatially changing temperature tens of thousands of years into the past. Using a technique called data assimilation—a statistical method for melding measurement data with numerical models—they mapped the temperature at 200-year intervals over the past 24 000 years, the entire period since the Last Glacial Maximum (ref. 1). The result is shown in figure 1.

Figure 1. Glaciers covered the northern latitudes 24 000 years before present (BP). As they receded, the planet warmed, but not evenly. New research assimilating temperature proxy data with climate modeling has reconstructed the temperature change across both time and space. Changes are relative to the average over the period 1000–1850 CE. (Figure by Matthew Osman.)

The work increases climatologists’ confidence in how present-day climate change fits into historical context. According to the results, never in the past 24 millennia has Earth been warmer than it is today, and never has it warmed faster than it’s warming today. And the vast majority of Earth’s temperature change—even in preindustrial times—is attributable to atmospheric greenhouse gases and to the reduction in albedo that accompanies deglaciation.
Model and measurement
Data assimilation methods are the engine of weather forecasting. Meteorologists have detailed models of atmospheric dynamics, but the models are only as good as their initial conditions. Even with thousands of instruments continuously monitoring the weather around the world, the complete state of the atmosphere is never precisely measured.
In its simplest form, data assimilation works like a weighted average: When a model and a measurement give conflicting values for the same scalar quantity, data assimilation outputs the best estimate of the quantity’s true value that accounts for the respective uncertainties of the model and measurement.
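In code, that weighted average takes just a few lines. The numbers below are invented for illustration; the weights are the inverse variances, so the more certain value counts for more.

```python
# Inverse-variance weighting of a model value and a measurement.
# All numbers here are made up for illustration.
x_model, var_model = 14.2, 1.0**2   # model estimate, variance (sigma = 1.0)
x_obs,   var_obs   = 15.0, 0.5**2   # measurement, variance (sigma = 0.5)

w_model = 1.0 / var_model
w_obs   = 1.0 / var_obs
x_best   = (w_model * x_model + w_obs * x_obs) / (w_model + w_obs)
var_best = 1.0 / (w_model + w_obs)  # combined estimate beats either input alone

print(round(x_best, 2), round(var_best, 2))  # prints: 14.84 0.2
```

When the model and measurement errors are independent and Gaussian, this inverse-variance combination is the best estimate of the true value.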
The state of the atmosphere is not a scalar, but it can be represented as a vector with billions of components. For vector quantities, data assimilation is similar but more complicated. Uncertainty is represented by a covariance matrix that describes correlations among the vector components, and the measurement probes only part of the vector state. Still, data assimilation solves for the most likely state given the combined model and measurement information.
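A minimal sketch of the vector case, written as a Kalman-style update with made-up numbers: the state has three components and only the first is measured, but the prior covariance lets the measurement inform a correlated, unmeasured component too.

```python
import numpy as np

x = np.array([10.0, 5.0, -2.0])      # model (prior) state
P = np.array([[2.0, 1.0, 0.0],       # prior covariance: components 0 and 1
              [1.0, 2.0, 0.0],       # are correlated; component 2 is not
              [0.0, 0.0, 2.0]])
H = np.array([[1.0, 0.0, 0.0]])      # observation operator: only x[0] is measured
y = np.array([12.0])                 # the measurement
R = np.array([[1.0]])                # measurement-error covariance

# Most likely state given model and measurement (the Kalman update).
K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)   # gain: how much to trust the data
x_post = x + K @ (y - H @ x)

# x[1] moves too, even though it was never measured, because the
# covariance says it is correlated with the measured component.
print(x_post)
```

That leakage of information through the covariance is exactly what lets sparse measurements constrain a billion-component atmospheric state.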
Forecast meteorologists use what’s called an online approach to data assimilation: They repeatedly rerun their weather models with newly acquired data to refine their knowledge of the atmosphere’s current state. Then they propagate that state forward to forecast the weather of the next several days.
Tierney, a geochemist with an interest in Bayesian statistical methods, sought to use a similar method to look into the past rather than the future. “The math is actually pretty simple—just a few lines of code,” she says. “But everything that went into it was really complicated.”
Plankton geochemistry
There weren’t any weather stations collecting data 24 000 years ago, so researchers need to rely on temperature proxies to infer the climate at that time. For the recent past, they can use tree rings: Trees grow faster in warm years than in cold ones. (See the article by Toby Ault and Scott St. George, Physics Today, August 2018, page 44.) But tree-ring records don’t come close to spanning 24 000 years.
Fortunately, single-celled ocean plankton leave geochemical records of temperature that date back not just thousands but millions of years. (See Physics Today, December 2001, page 16.)
Many research teams have collected and analyzed plankton proxies over the years. But they’ve all done so for their own purposes—usually to study local climate, not global—so the data haven’t all been stored in any central database, nor even in any standardized format. “The data were all over the place, like in the supplemental parts of research papers from the 1980s,” says Tierney. She and her colleagues had to track them all down, compile and reformat them, account for changing calibration standards, and in some cases correct errors.
The proxy locations are shown in figure 2.

Figure 2. Ocean plankton preserved in seafloor sediments provide some of the best records of Earth’s past temperature. Shown here are the locations of plankton sampling sites. The circle colors represent the type of geochemical data, and the sizes represent the duration of each temperature record in kiloyears (kyr). Relative to the rest of the world, the Southern Ocean—an important but inaccessible region—is poorly sampled. (Adapted from ref. 1.)
The researchers used the Community Earth System Model, the National Center for Atmospheric Research’s flagship climate model, which the Intergovernmental Panel on Climate Change also uses for its future projections. Even on a supercomputer, modeling 24 000 years of climate evolution took so much computer time that Tierney and colleagues had to replace the meteorologists’ online approach to data assimilation with an offline one: Rather than incorporating new data at every time step, they ran the model in its entirety and assimilated the data afterward. To check the validity of their reconstruction, they excluded a few measurement records from the assimilation, then compared the withheld data against the output. The temperatures agreed well.
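The withhold-and-check validation can be sketched with toy data. Everything below is invented: the “reconstruction” is just an average of the assimilated records, standing in for the full offline data-assimilation machinery.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 20 proxy records, each a noisy view of one shared climate signal.
n_sites, n_times = 20, 120
truth = np.cumsum(rng.normal(0.0, 0.2, size=n_times))            # shared signal
proxies = truth + rng.normal(0.0, 0.3, size=(n_sites, n_times))  # noisy records

# Withhold a few records from the "assimilation" ...
withheld = rng.choice(n_sites, size=4, replace=False)
used = np.setdiff1d(np.arange(n_sites), withheld)

# ... build the reconstruction from the rest (here, a plain average) ...
reconstruction = proxies[used].mean(axis=0)

# ... then check the withheld records against the output.
for i in withheld:
    r = np.corrcoef(reconstruction, proxies[i])[0, 1]
    print(f"withheld site {i}: correlation {r:.2f}")
```

High correlation with records the method never saw is evidence that the reconstruction captures real signal rather than overfitting the assimilated data.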
A simple story
A big benefit of having a spatiotemporally resolved climate map is that the researchers can decompose it into its principal components of variance to see where temperature changes are correlated. In the past 24 000 years, warming has been concentrated in the northern latitudes—in northern Europe and present-day Canada—where glaciers dominated the landscape at the time of the Last Glacial Maximum and have retreated the most since then. In fact, more than 90% of the global temperature variance is described by that Arctic-dominated mode, which in turn is almost perfectly correlated with atmospheric greenhouse gas levels and glacial extent.
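The decomposition itself is standard: subtract each location’s time mean, then take the singular value decomposition of the space-by-time anomaly matrix. The sketch below applies it to a synthetic field (one dominant, north-heavy warming pattern plus noise), not the actual reconstruction.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic space-by-time temperature field: a single warming mode whose
# amplitude is largest at the "northern" sites, plus small noise.
n_sites, n_times = 50, 120
pattern = np.linspace(2.0, 0.2, n_sites)       # strongest change in the north
warming = np.linspace(-1.0, 1.0, n_times)      # steady rise through time
field = np.outer(pattern, warming) + rng.normal(0.0, 0.1, (n_sites, n_times))

# Principal components of variance (empirical orthogonal functions) via SVD.
anomalies = field - field.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(anomalies, full_matrices=False)
explained = s**2 / np.sum(s**2)    # fraction of variance per mode

print(f"leading mode explains {explained[0]:.0%} of the variance")
# U[:, 0] is that mode's spatial pattern; Vt[0] is its time evolution.
```

In the real reconstruction, the analogous leading mode is the Arctic-dominated one that tracks greenhouse gas levels and glacial extent.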
It’s well established that modern-day climate change is driven by greenhouse gas emissions and compounded by deglaciation. But to see the same effect in play before humans made their mark on the climate is a new and striking result. Plenty of other factors could conceivably have influenced global climate on millennial time scales—including changes in vegetation and windblown mineral dust—but it turns out that they haven’t. “The cleanliness of the signal was a bit of a surprise,” says Tierney. “There may be more complicated things going on at the regional level, but globally, the story is quite simple.”
The data assimilation also sheds new light on the so-called Holocene temperature conundrum, the puzzle of how Earth’s temperature has changed over the past 7000 years. Previous temperature reconstructions, including the red dotted line in figure 3 (ref. 2), suggest that the planet reached peak warmth several millennia ago and has been gradually cooling since.

Figure 3. Earth’s warming from 24 000 years before present (BP) until now has not been subtle. But the temperature trend during the Holocene epoch—especially the past 7000 years—is less certain. Previous work, including an estimate created by Shaun Marcott, Jeremy Shakun, and colleagues (red dotted line; ref. 2), suggests a slight cooling over that period.

The data-assimilation reconstruction (blue) shows the opposite trend—a slight but definite warming across the entire period—more consistent with greenhouse gas records. Through statistical analysis, Osman, Tierney, and colleagues attribute the discrepancy with the proxy-only reconstruction to the undersampling of the Southern Ocean: With so little temperature data from that part of the world, the proxy-only reconstructions may have been filling in the gaps incorrectly.
The data gaps may yet be remedied directly. Temperature records exist in the Southern Ocean sediments, and researchers willing to brave the stormy seas could collect important new pieces of the climate puzzle. “When I’ve given talks to colleagues who specialize in the Southern Ocean, they’ve been pleased to see that the region is so important,” says Tierney. “So maybe this will inspire new research cruises to go down there.”
Digging deeper
One of the main motivations for reconstructing past climate conditions is to look for clues about what’s in store for Earth’s warming future. But by that standard, the time since the Last Glacial Maximum is an imperfect guide: The temperature, rate of warming, and greenhouse gas levels are all higher now than at any other point in that period.
But that wasn’t always the case. Looking back millions rather than thousands of years, one can find plenty of times when Earth was hotter and greenhouse gases were more abundant than they are today. In particular, the Paleocene–Eocene Thermal Maximum, an anomalous temperature spike some 55 million years ago, is an ominous analogue for present-day warming, although the rate of temperature change was still far slower then than it is now.
One goal on the researchers’ minds is to turn their data-assimilation techniques to the more distant past to get a more complete picture of how Earth’s climate behaved—and may behave again—under those extreme greenhouse conditions.
References
1. M. B. Osman et al., Nature 599, 239 (2021). https://doi.org/10.1038/s41586-021-03984-4
2. J. D. Shakun et al., Nature 484, 49 (2012). https://doi.org/10.1038/nature10915
S. A. Marcott et al., Science 339, 1198 (2013). https://doi.org/10.1126/science.1228026