Global warming ‘hiatus’ never happened, scientists say

A new study reveals that the evidence for a recent pause in the rate of global warming lacks a sound statistical basis. The finding highlights the importance of using appropriate statistical techniques and should improve confidence in climate model projections.

An apparent lull in the recent rate of global warming that has been widely accepted as fact is actually an artifact arising from faulty statistical methods, an interdisciplinary team of Stanford scientists says.

The study, titled “Debunking the climate hiatus” and published online this week in the journal Climatic Change, is a comprehensive assessment of the purported slowdown, or hiatus, of global warming.

“We translated the various scientific claims and assertions that have been made about the hiatus and tested to see whether they stand up to rigorous statistical scrutiny,” said study lead author Bala Rajaratnam, an assistant professor of statistics and of Earth system science.

The finding calls into question the idea that global warming “stalled” or “paused” during the period between 1998 and 2013. Reconciling the hiatus was a major focus of the 2013 climate change assessment by the Intergovernmental Panel on Climate Change (IPCC).

Using a novel statistical framework that was developed specifically for studying geophysical processes such as global temperature fluctuations, Rajaratnam and his team of Stanford collaborators have shown that the hiatus never happened.

“Our results clearly show that, in terms of the statistics of the long-term global temperature data, there never was a hiatus, a pause or a slowdown in global warming,” said Noah Diffenbaugh, a climate scientist in the School of Earth, Energy & Environmental Sciences, and a co-author of the study.

Faulty ocean buoys

The Stanford group’s findings are the latest in a growing series of papers to cast doubt on the existence of a hiatus. Another study, led by Thomas Karl, the director of the National Centers for Environmental Information of the National Oceanic and Atmospheric Administration (NOAA) and published recently in the journal Science, found that many of the ocean buoys used to measure sea surface temperatures during the past couple of decades gave cooler readings than measurements gathered from ships. The NOAA group suggested that by correcting the buoy measurements, the hiatus signal disappears.

While the Stanford group also concluded that there has not been a hiatus, one important distinction of their work is that they reached that conclusion using both the older, uncorrected temperature measurements and the newer, corrected measurements from the NOAA group.

“By using both data sets, nobody can claim that we made up a new statistical technique in order to get a certain result,” said Rajaratnam, who is also an affiliated faculty member at the Stanford Woods Institute for the Environment. “We saw that there was a debate in the scientific community about the global warming hiatus, and we realized that the assumptions of the classical statistical tools being used were not appropriate and thus could not give reliable answers.”

More importantly, the Stanford group’s technique does not rely on strong assumptions to work. “If one makes strong assumptions and they are not correct, the validity of the conclusion is called into question,” Rajaratnam said.

A different approach

Rajaratnam worked with Stanford statistician Joseph Romano and Earth system science graduate student Michael Tsiang to take a fresh look at the hiatus claims. The team methodically examined not only the temperature data but also the statistical tools scientists were using to analyze the data. A look at the latter revealed that many of the statistical techniques climate scientists were employing had been developed for other fields, such as biology or medicine, and are not ideal for studying geophysical processes. “The underlying assumptions of these analyses often weren’t justified,” Rajaratnam said.

For example, many classical statistical tools assume that data points are independent of one another and follow a normal, or Gaussian, distribution. They also ignore the spatial and temporal dependencies that are important when studying temperature, rainfall and other geophysical phenomena, which can change daily or monthly and often depend on previous measurements. If it is hot today, for instance, there is a higher chance that it will be hot tomorrow as well, because a heat wave is already in effect.
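To make that dependence concrete, here is a minimal Python sketch (illustrative only, not the study’s code) comparing independent Gaussian noise with a simple autoregressive series in which each value partly carries over from the one before it; the 0.8 carry-over coefficient is an arbitrary choice for the example:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Independent, identically distributed Gaussian noise -- the implicit
# assumption behind many classical statistical tests.
iid = rng.normal(size=n)

# A simple autoregressive series: each value partly carries over from the
# one before it, loosely mimicking day-to-day temperature persistence.
# The 0.8 coefficient is arbitrary, chosen only for illustration.
ar = np.empty(n)
ar[0] = rng.normal()
for t in range(1, n):
    ar[t] = 0.8 * ar[t - 1] + rng.normal()

def lag1_autocorr(x):
    """Correlation between each value and the one immediately before it."""
    x = x - x.mean()
    return (x[1:] * x[:-1]).sum() / (x * x).sum()

print(f"lag-1 autocorrelation, independent noise:     {lag1_autocorr(iid):+.2f}")  # near 0
print(f"lag-1 autocorrelation, autoregressive series: {lag1_autocorr(ar):+.2f}")   # near +0.8
```

A test that treats the second series as if it behaved like the first will badly misjudge how much its averages naturally wander.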

Global surface temperatures are similarly linked, and one of the clearest examples of this can be found in the oceans.

“The ocean is very deep and can retain heat for a long time,” said Diffenbaugh, who is also a senior fellow at the Stanford Woods Institute for the Environment. “The temperature that we measure on the surface of the ocean is a reflection not just of what’s happening on the surface at that moment, but also the amount of trapped heat beneath the surface, which has been accumulating for years.”

While designing a framework that would take temporal dependencies into account, the Stanford scientists quickly ran into a problem. Those who argue for a hiatus claim that global surface temperatures either did not increase at all during the 15-year period between 1998 and 2013, or rose at a much slower rate than in the years before 1998. Statistically, however, this is a hard claim to test because the number of data points in the purported hiatus period is relatively small, and most classical statistical tools require large numbers of data points.

There is a workaround, however. A technique that Romano invented in 1992, called “subsampling,” is useful for discerning whether a variable – be it surface temperature or stock prices – has changed in the short term based on a limited amount of data.

“In order to study the hiatus, we took the basic idea of subsampling and then adapted it to cope with the small sample size of the alleged hiatus period,” Romano said. “When we compared the results from our technique with those calculated using classical methods, we found that the statistical confidence obtained using our framework is 100 times stronger than what was reported by the NOAA group.”
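The basic block idea behind subsampling can be sketched in a few lines of Python. The sketch below illustrates the general approach, not the authors’ adapted method: it computes the trend slope in every contiguous 15-point window of a synthetic, autocorrelated “temperature” record and asks where the most recent window’s slope falls within that historical distribution. All coefficients and lengths are invented for the example:

```python
import numpy as np

def ols_slope(y):
    """Least-squares trend slope of a series against time."""
    t = np.arange(len(y))
    return np.polyfit(t, y, 1)[0]

def subsampling_slopes(series, window):
    """Slope of every contiguous block of length `window` -- the core
    subsampling idea: blocks keep the time ordering intact."""
    return np.array([ols_slope(series[i:i + window])
                     for i in range(len(series) - window + 1)])

rng = np.random.default_rng(1)
n_years, window = 130, 15     # e.g. ~130 years of annual data, a 15-year window

# Toy "temperature" record: a steady warming trend plus autocorrelated noise
# (trend and noise parameters are made up for illustration).
noise = np.empty(n_years)
noise[0] = rng.normal(scale=0.1)
for t in range(1, n_years):
    noise[t] = 0.6 * noise[t - 1] + rng.normal(scale=0.1)
series = 0.008 * np.arange(n_years) + noise

recent = ols_slope(series[-window:])                  # trend in the last 15 "years"
dist = subsampling_slopes(series[:-window], window)   # historical 15-year trends
frac = (dist <= recent).mean()
print(f"recent 15-yr slope sits at the {frac:.0%} percentile of historical slopes")
```

Because every block preserves the original ordering of the data, the reference distribution reflects how much 15-year trends naturally vary in a dependent series, which is exactly what a claim of a “pause” must be judged against.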

The Stanford group’s technique also handled temporal dependency in a more sophisticated way than in past studies. For example, the NOAA study accounted for temporal dependency when calculating sea surface temperature changes, but it did so in a relatively simple way, with one temperature point being affected only by the temperature point directly prior to it.

“In reality, however, the temperature could be influenced not just by the previous data point, but by points six or 10 steps before it,” Rajaratnam said.
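The distinction here is between a first-order autoregressive model, in which only the immediately preceding value matters, and higher-order models that reach further back. A small, self-contained Python sketch (with made-up coefficients, not fitted to any real temperature data) shows how one can let the data choose the lag order by comparing AIC scores of least-squares AR fits:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 3000

# Toy series whose memory reaches several steps back (coefficients invented).
x = np.zeros(n)
for t in range(6, n):
    x[t] = 0.4 * x[t - 1] + 0.2 * x[t - 3] + 0.2 * x[t - 6] + rng.normal()

def ar_aic(x, p):
    """Fit an AR(p) model by least squares and return its AIC (lower is better)."""
    n = len(x)
    X = np.column_stack([x[p - k : n - k] for k in range(1, p + 1)])  # lagged values
    y = x[p:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    sigma2 = np.mean((y - X @ coef) ** 2)          # residual variance
    return len(y) * np.log(sigma2) + 2 * p          # AIC up to a constant

aics = {p: ar_aic(x, p) for p in range(1, 11)}
best = min(aics, key=aics.get)
print(f"AIC-preferred lag order: {best}")  # typically well above 1 for this series
```

Forcing a lag order of 1 on such a series, as a first-order model does, discards real memory in the data and can distort the apparent significance of a short-term trend.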

Pulling marbles out of a jar

To understand how the Stanford group’s subsampling technique differs from the classical techniques that had been used before, imagine placing 50 colored marbles, each one representing a particular year, into a jar. The marbles range from blue to red, signifying different average global surface temperatures.

“If you wanted to determine the likelihood of getting 15 marbles of a certain color pattern, you could repeatedly pull out 15 marbles at a time, plot their average color on a graph, and see where your original marble arrangement falls in that distribution,” Tsiang said. “This approach is analogous to how many climate scientists had previously approached the hiatus problem.”

In contrast, the new strategy that Rajaratnam, Romano and Tsiang invented is akin to stringing the marbles together before placing them into the jar.

“Stringing the marbles together preserves their relationships to one another, and that’s what our subsampling technique does,” Tsiang said. “If you ignore these dependencies, you can alter the strength of your conclusions or even arrive at the opposite conclusion.”
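The difference between loose and strung marbles can be demonstrated directly. In this illustrative Python sketch (a toy example, not the study’s analysis), 15 values drawn at random from an autocorrelated series of 50 vary much less than 15 contiguous values, which is exactly the dependence that classical resampling throws away:

```python
import numpy as np

rng = np.random.default_rng(3)

# Autocorrelated toy series: the "jar" of 50 marbles, each depending on the
# one before it (the 0.7 coefficient is arbitrary).
n, k = 50, 15
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.7 * x[t - 1] + rng.normal()

# "Loose marbles": draw 15 values at random, destroying their time ordering.
loose = np.array([rng.choice(x, size=k, replace=False).mean()
                  for _ in range(5000)])

# "Strung marbles": every contiguous run of 15, preserving the ordering.
strung = np.array([x[i:i + k].mean() for i in range(n - k + 1)])

# With positive autocorrelation, contiguous blocks typically vary far more
# than random draws, so ignoring the dependence understates the natural
# spread of 15-year averages.
print(f"std of random-draw means:      {loose.std():.3f}")
print(f"std of contiguous-block means: {strung.std():.3f}")
```

Understating that spread makes an ordinary run of years look like a statistically remarkable pause.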

When the team applied their subsampling technique to the temperature data, they found that the rate of increase of global surface temperature did not stall or slow down from 1998 to 2013 in a statistically significant manner. In fact, the rate of change in global surface temperature was not statistically distinguishable between the recent period and other periods earlier in the historical data.

The Stanford scientists say their findings should go a long way toward restoring confidence in the basic science and climate computer models that form the foundation for climate change predictions.

“Global warming is like other noisy systems that fluctuate wildly but still follow a trend,” Diffenbaugh said. “Think of the U.S. stock market: There have been bull markets and bear markets, but overall it has grown a lot over the past century. What is clear from analyzing the long-term data in a rigorous statistical framework is that, even though climate varies from year to year and decade to decade, global temperature has increased in the long term, and the recent period does not stand out as being abnormal.”

