
Earth’s land surface temperature trends: A new approach confirms previous results

APR 01, 2013
The newcomers to the task looked at many more weather stations and used a geostatistics technique to adjust for data discontinuities.

DOI: 10.1063/PT.3.1936

People have been measuring and recording temperatures since the 17th century, and scientists have been using those data to estimate a mean global temperature since the late 19th century. As weather stations increased in precision and in global coverage, especially in the last half of the 20th century, they facilitated better estimates of the mean global temperature—a key indicator of possible climate change.

For the past 30 years, those estimates have primarily been done by three independent teams: the Climatic Research Unit (CRU) of the University of East Anglia in the UK, in collaboration with the Hadley Center of the UK Met Office (ref. 1); NASA’s Goddard Institute for Space Studies (GISS, ref. 2); and the National Oceanic and Atmospheric Administration’s National Climatic Data Center (NCDC, ref. 3).

Their task is not a simple one. For decades, researchers have compiled databases of historic temperature records from disparate sources in more than 100 countries. Analysts must correct those raw data for discontinuities in a given station’s time series caused by factors unrelated to climate, such as installation of an instrument, a move to a new location, or changes in recording practices. Next they must sort the global trend from the expected local weather fluctuations. In addition, they must account for the uneven global coverage, with more plentiful data from North America, for example, than from Africa. Despite different approaches to those tasks, the three main groups have produced consistent results: All report an increase in the global temperature over the past century.

Given the importance of the temperature trends to the climate discussion, critics have focused on possible biases from such factors as data selection or the urban heat island effect—the warming experienced by some weather stations as the surrounding area becomes more densely developed. Scientists with the CRU, GISS, and the NCDC have checked and corrected for such biases in various ways, but some critics remained unconvinced.

To address some of the concerns, a fourth group recently took a different approach to the problem. Richard Muller of the University of California (UC), Berkeley, and his daughter Elizabeth started the Berkeley Earth Surface Temperature project under the auspices of the nonprofit Novim Group in Santa Barbara, California; the project’s collaborators include scientists from UC Berkeley, Lawrence Berkeley National Laboratory, and Oregon State University. The results were recently published (ref. 4), although the work has been posted on the group’s website (http://www.berkeleyearth.org) and publicly discussed for the past year.

The group’s temperature estimates, done so far just for land surface temperatures, agree well with those from previous profiles. (See the figure.) The Berkeley group extended its analysis back to 1750, a century earlier than in other studies, although the sampling in the early period was poor. It might be useful to compare such data with estimates from various temperature proxies, says Zeke Hausfather of C3 Energy in Redwood City, California. In future work, the Berkeley team plans to incorporate marine temperatures, as the other three groups have done, since oceans represent 70% of Earth’s surface.


Global land surface temperatures, shown as 10-year running averages. The new estimate made by the Berkeley Earth Surface Temperature project is shown in black, with shaded areas representing uncertainties of one and two standard deviations. The curve compares well with land-only averages calculated for times after 1850 by three other groups: the Climatic Research Unit of the University of East Anglia and the UK Met Office Hadley Center (red); NASA’s Goddard Institute for Space Studies (purple); and the National Climatic Data Center of the National Oceanic and Atmospheric Administration (green). Those groups reported temperature anomalies relative to various base periods, so the Berkeley analysts added a constant factor to each to match the absolute calibration shown. (Adapted from ref. 4.)


Thomas Karl, NCDC director, welcomes the entry of a fourth independent research group that makes different assumptions to analyze the same variable. Gavin Schmidt of NASA’s GISS comments that the particular approach taken by the Berkeley team addresses several specific criticisms: that the prior analyses did not use enough data and that they used flawed procedures for correcting data discontinuities.

Differing approaches

A key distinction of the Berkeley group’s treatment is that it allows analysts to handle short and discontinuous temperature records; the other three groups, by contrast, have relied on stations with fairly long temporal records, usually on the order of decades. No more than about 8000 sites have been included in past analyses. The Berkeley group works with temperature records from about 40 000 sites.

One of the first steps in calculating a global temperature is to homogenize the data—that is, detect and correct for discontinuities in the temporal data set from each station. For example, analysts may compare the time records from adjacent stations, which should be experiencing roughly the same weather conditions. Any jump in one time series not seen in others is flagged, and the dataset is adjusted accordingly.
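As a toy illustration of that neighbor-comparison step (illustrative code, not the homogenization algorithm of any of the groups), one can difference a station against a nearby neighbor so that the shared weather cancels, locate the largest step change, and shift the later segment:

```python
import numpy as np

def detect_and_adjust(station, neighbor, threshold=1.0):
    """Flag the largest mean step in (station - neighbor) and remove it."""
    diff = station - neighbor              # shared weather cancels out
    best_k, best_jump = None, 0.0
    for k in range(1, len(diff)):          # candidate breakpoint positions
        jump = diff[k:].mean() - diff[:k].mean()
        if abs(jump) > abs(best_jump):
            best_k, best_jump = k, jump
    if abs(best_jump) < threshold:         # no discontinuity worth fixing
        return station, None
    adjusted = station.copy()
    adjusted[best_k:] -= best_jump         # subtract the non-climatic step
    return adjusted, best_k

# Synthetic example: both sites experience the same weather, but the
# station record jumps by +2 degrees after month 60 (say, an
# instrument change).
rng = np.random.default_rng(0)
weather = rng.normal(0.0, 0.5, 120)
neighbor = 10.0 + weather
station = 10.0 + weather
station[60:] += 2.0

adjusted, cut = detect_and_adjust(station, neighbor)
print(cut)  # 60
```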

When the Berkeley collaborators encounter a discontinuity, they break the data set into two records and treat the resulting fragments as independent records from the same location. Such an approach effectively multiplies the number of record fragments they handle by about a factor of four, to something like 170 000. The compensation for the discontinuities—or offsets—is determined by the team’s global statistical treatment.
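The splitting itself is simple bookkeeping. A sketch with made-up numbers (not the project’s code):

```python
def scalpel(series, breakpoints):
    """Cut one station record into fragments at the given indices; each
    fragment is then treated as an independent record from the same site."""
    fragments, start = [], 0
    for b in sorted(breakpoints):
        fragments.append(series[start:b])
        start = b
    fragments.append(series[start:])
    return [f for f in fragments if f]     # drop empty pieces

# Suppose quality control flags discontinuities after indices 2 and 5.
record = [10.1, 10.3, 12.4, 12.2, 12.5, 9.8, 9.9]
pieces = scalpel(record, [2, 5])
print(pieces)  # [[10.1, 10.3], [12.4, 12.2, 12.5], [9.8, 9.9]]
```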

For any global temperature estimates, analysts must combine individual records into a global average. Typically, they divide the globe into a grid and, for each grid square, estimate the temperature—or more commonly, the temperature anomaly compared with some long-time local average. The different groups adopt different methods for combining and weighting the data from stations in or near each grid square into a representative value for that area. They also use different methods to estimate the value in a grid square that doesn’t enclose a reporting station.
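A toy version of that gridding-and-anomaly bookkeeping, with invented station data and a simple cosine-latitude area weight (none of this reproduces a particular group’s scheme):

```python
import math
from collections import defaultdict

stations = [  # (latitude, grid cell id, monthly temperatures) -- invented
    (60.0, "A", [2.0, 2.5, 3.0]),
    (62.0, "A", [1.0, 1.5, 2.0]),
    (10.0, "B", [25.0, 25.2, 25.4]),
]

def anomalies(temps):
    base = sum(temps) / len(temps)         # station's own long-term mean
    return [t - base for t in temps]

# Group stations into grid cells.
cells = defaultdict(list)
for lat, cell, temps in stations:
    cells[cell].append((lat, anomalies(temps)))

# Combine the third month into a single area-weighted anomaly.
month, total, total_w = 2, 0.0, 0.0
for members in cells.values():
    cell_anom = sum(a[month] for _, a in members) / len(members)
    cell_lat = sum(lat for lat, _ in members) / len(members)
    w = math.cos(math.radians(cell_lat))   # crude area weight
    total += w * cell_anom
    total_w += w

global_anom = total / total_w
print(round(global_anom, 3))
```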

Rather than first adjusting the offset variables for measurements at each site and then computing the global average temperature, as the other groups do, the Berkeley method essentially performs both tasks at once. Robert Rohde of the Berkeley Earth project explains that his group used a standard geostatistics technique known as kriging.
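In its generic textbook form, kriging predicts the value at an unsampled point as a covariance-weighted combination of nearby observations. The sketch below uses a squared-exponential covariance model chosen for illustration; it is not the Berkeley Earth implementation.

```python
import numpy as np

def kriging_predict(xs, ys, x_new, length=2.0):
    """Simple (zero-mean) kriging with a squared-exponential covariance."""
    cov = lambda a, b: np.exp(-((a - b) ** 2) / (2.0 * length ** 2))
    K = cov(xs[:, None], xs[None, :]) + 1e-9 * np.eye(len(xs))
    k = cov(xs, x_new)                     # covariance with the new point
    weights = np.linalg.solve(K, k)        # nearby data points dominate
    return weights @ ys

# Interpolate a spatial trend at x = 2 from four observed sites.
xs = np.array([0.0, 1.0, 3.0, 4.0])
ys = np.array([10.0, 11.0, 13.0, 14.0])
pred = float(kriging_predict(xs, ys, 2.0))
print(round(pred, 2))                      # close to 12, the local trend
```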

Basically, the Berkeley method is cast as a very large minimization problem. The quantity to be minimized is the local weather term: the deviation of the local temperature from the global average, which should average to zero over long time periods or large spatial scales. One can write the temperature measurement for a given place and time as the sum of four terms: an average global temperature Tavg; the positional variation caused by latitude or elevation; the measurement bias, or offset variable; and the temperature associated with local weather. Turning that equation around gives the local weather term for a given month as the measured temperature for that month minus the global mean Tavg, the positional variation, and the offset variable.
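In symbols, that decomposition reads as follows (notation chosen here to mirror the paragraph; ref. 4 uses its own conventions):

```latex
% T_i(t): measured temperature for record i in month t
% T_avg(t): average global land temperature
% C(x_i): positional variation (latitude, elevation) at site x_i
% b_i: measurement bias, or offset variable, for record i
% W(x_i, t): temperature associated with local weather
T_i(t) = T_{\mathrm{avg}}(t) + C(x_i) + b_i + W(x_i, t),
\qquad
W(x_i, t) = T_i(t) - T_{\mathrm{avg}}(t) - C(x_i) - b_i .
```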

Values for Tavg and for the station offset variables emerge from a global minimization procedure. As Rohde explains, he and his collaborators weighted the monthly weather term from each grid square by a factor related to its correlation with other stations and to the station density. They then summed those monthly terms over all grid squares and all times and adjusted the offset variables and the monthly means Tavg to minimize the mean square of the local weather term. After each minimization, the values of the offset variables are used to calculate new estimates of Tavg, and the process is repeated iteratively.
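The alternating structure of that fit can be sketched as follows. This simplified version drops the correlation and station-density weighting and the positional term and just alternates two least-squares updates; it illustrates the joint-estimation idea, not the actual weighted procedure of ref. 4.

```python
import numpy as np

def joint_fit(records, n_iter=50):
    """records: array (n_records x n_months), with np.nan marking gaps."""
    offsets = np.zeros(len(records))
    for _ in range(n_iter):
        # Given the offsets, the monthly mean is the average of the
        # debiased records.
        t_avg = np.nanmean(records - offsets[:, None], axis=0)
        # Given t_avg, each record's offset is its mean residual.
        offsets = np.nanmean(records - t_avg[None, :], axis=1)
        offsets -= offsets.mean()          # gauge fix: offsets sum to zero
    return t_avg, offsets

# Two fragments of the same underlying signal, one biased by +3 degrees.
true_signal = np.array([0.0, 1.0, 0.5, -0.5, 0.2, 0.8])
records = np.vstack([true_signal, true_signal + 3.0])
t_avg, offsets = joint_fit(records)
print(offsets)               # the +3 bias is split symmetrically
print(t_avg - true_signal)   # signal recovered up to a constant level
```

Because only differences are constrained, the absolute level of Tavg is fixed by the gauge choice (offsets summing to zero); the real analysis pins the absolute calibration separately.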

Data sets

The Berkeley group drew on 14 previously compiled databases. (Most of those (ref. 5) were in the process of being included in version 3 (ref. 6) of the Global Historical Climatology Network that the NCDC maintains and uses.) Although many of those databases are available to the public in various forms, Schmidt credits the Berkeley team with pulling them all together and releasing them in a consistent way.

Muller says that his aim throughout has been transparency. His team has put not only the raw temperature data but the computer code used in its analysis on its website. He encourages others to make their own assumptions and do their own calculations.

References

  1. P. D. Jones et al., J. Geophys. Res. 117, D05127 (2012). https://doi.org/10.1029/2011JD017139

  2. J. Hansen et al., Rev. Geophys. 48, RG4004 (2010). https://doi.org/10.1029/2010RG000345

  3. R. S. Vose et al., Bull. Am. Meteorol. Soc. 93, 1677 (2012). https://doi.org/10.1175/BAMS-D-11-00241.1;
    J. H. Lawrimore et al., J. Geophys. Res. 116, D19121 (2011). https://doi.org/10.1029/2011JD016187

  4. R. Rohde et al., Geoinfor. Geostat.: An Overview 1, 1 (2013). http://www.scitechnol.com/GIGS/GIGS-1-101.pdf

  5. P. W. Thorne et al., Bull. Am. Meteorol. Soc. 92, ES40 (2011). https://doi.org/10.1175/2011BAMS3124.1

  6. M. J. Menne et al., J. Atmos. Ocean. Technol. 29, 897 (2012). https://doi.org/10.1175/JTECH-D-11-00103.1

This Content Appeared In

Volume 66, Number 4

