

Greenland temperature variations (green) and mean sea level (blue) over the last 20,000 years.


Pause for Thought

John Reid


Why is it that the global climate appears to have stalled and refused to grow warmer as the “settled science” has predicted? Why are general circulation models (GCMs) so poor at predicting recent global temperature trends? I propose that the problem originates in the assumptions on which the models are based.

Computers have, in general, been such a boon to science that no one any longer questions the validity of some applications, particularly numerical models based on differential equations. All such models rest on certain assumptions, assumptions which are rarely questioned or even acknowledged. One is the complete absence of discontinuities: the cliffs, fronts and shocks which are, in reality, widespread in nature.

However, by far the most subtle and far-reaching hidden assumption is that of determinism: the idea, put forward in the early nineteenth century by Pierre-Simon Laplace, that should an intelligence know the precise state of the universe at one instant, it could predict the state of the universe at any future time. This idea underlies computer models based on differential equations.

Physicists have understood the underlying stochastic (i.e. non-deterministic, random) propensities of nature for more than a century. To a physicist, deterministic numerical models of natural processes may have their uses, but they are known to be limited in scope: meteorological models, for example, cannot predict more than about a week ahead.

On the other hand, stochastic models (i.e. models which contain some random elements) are usually frequency-domain models and are much more powerful. They frequently involve examining how energy or variance is distributed with frequency, a description known as the “power spectrum”. If the theory doesn’t fit the data, then the theory is wrong; there is no room for special pleading. It was this sort of modeling, applied to the black-body spectrum, that led to the birth of quantum theory at the turn of the twentieth century, one of the great triumphs of modern physics.
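The power-spectrum idea is easy to sketch numerically. The following is a minimal illustration (mine, not from the original article) using NumPy’s FFT: a periodogram estimate applied to white noise gives a roughly flat spectrum, whereas red noise concentrates its variance at low frequencies.

```python
import numpy as np

def power_spectrum(x, dt=1.0):
    """Periodogram estimate: squared magnitude of the discrete Fourier
    transform of the mean-removed series (one-sided, up to normalisation)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    x = x - x.mean()                        # remove the mean (zero-frequency term)
    spec = np.abs(np.fft.rfft(x)) ** 2 * dt / n
    freqs = np.fft.rfftfreq(n, d=dt)
    return freqs, spec

rng = np.random.default_rng(0)

# White noise: variance spread evenly across frequencies (flat spectrum).
f_white, p_white = power_spectrum(rng.standard_normal(4096))

# Red noise (here a random walk): variance concentrated at low frequencies.
f_red, p_red = power_spectrum(np.cumsum(rng.standard_normal(4096)))
```

Comparing `p_red` at the low- and high-frequency ends shows the steep “red” slope of the kind Pelletier (2002) reports for proxy temperature records.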

Climate science is dominated by time-domain, deterministic modeling – GCMs. In this field applied mathematicians and computer programmers have replaced physicists. A deterministic modeler looks at the graph of global average temperature for the last century and sees that it is increasing. This small change in temperature must have a cause because, if you see the world deterministically, every effect must have a cause. A good candidate is the increasing level of atmospheric CO2 due to modern industry. In the laboratory CO2 absorbs radiation, so, the reasoning goes, something similar must happen on a global scale. The modeler ignores the simple physical facts that total man-made production of CO2 since the beginning of the Industrial Revolution accounts for only one or two percent of the total CO2 in the ocean-atmosphere system, and that convection completely dominates radiation in the transport of heat through the lower atmosphere. Never mind, they tell themselves; we can always plug in enough feedbacks and fudge factors to make our model work.

At least in the short term.

A stochastic modeler (e.g. a physicist) looks at the same data and sees quasi-cyclic random fluctuations superimposed on a rising trend. It looks like red noise, meaning that random variations are bigger at longer time scales than at shorter ones. Both the rising trend in recent global average temperature and that in atmospheric CO2 concentration are, most likely, the outcome of noise components with periods longer than the record length. Examination of much longer records of proxy temperature data from ice-cores shows that this is indeed the case (Pelletier, 2002). The proxy data does indeed have a red spectrum, and the recent, measured temperatures are typical of what you get when you take a short sample from such a red-noise time series. If you see the world stochastically, there is nothing unusual about the climate of the twentieth century.
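To make the point concrete, here is a small illustrative simulation (a sketch of mine, not the author’s analysis): an AR(1) process with lag-one coefficient near one is a standard discrete-time model of red noise, and short windows cut from it routinely show “trends” that are pure noise.

```python
import numpy as np

rng = np.random.default_rng(42)

def ar1(n, phi=0.99, sigma=1.0):
    """AR(1) ('red noise') process: x[t] = phi * x[t-1] + noise.
    With phi close to 1, variance builds up at long time scales."""
    x = np.zeros(n)
    eps = rng.normal(0.0, sigma, n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + eps[t]
    return x

series = ar1(100_000)

# Fit a straight line to a short window: purely random data routinely
# shows an apparent "trend" over records shorter than the dominant
# noise time scales.
window = series[50_000:50_150]
slope = np.polyfit(np.arange(len(window)), window, 1)[0]
```

The slope of any such window is an artefact of the noise, not evidence of an external cause; variance measured over short windows is systematically smaller than the variance of the full record.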

The stochastic modeler then takes a longer look at the ice core time series over the last half million years or so. It is very interesting. There have indeed been large swings in climate. The last one ended 11,000 years ago. Climate at this longer time scale looks very much like a particular type of red spectrum known as a “random walk”. (A random walk is the sum that you get if you throw a coin over and over again and add one for heads and subtract one for tails after each throw.) There is a big difference though. A random walk tends to wander further and further away from zero (variance increases with time) but the temperature throughout the succession of ice ages and interglacials remains within a narrow channel (between about -18 and +10 deg C). It is a “bounded random walk”.
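The coin-toss picture translates directly into code. The sketch below is illustrative only (the step size is an arbitrary choice of mine; the channel limits are the text’s -18 to +10 deg C range): it clamps an ordinary random walk at the two boundaries.

```python
import numpy as np

rng = np.random.default_rng(1)

def bounded_random_walk(n, lo=-18.0, hi=10.0, step=0.5):
    """Coin-toss random walk clamped at lo and hi: it wanders freely
    inside the channel, but its variance stops growing once it has
    explored the full range, unlike an ordinary random walk."""
    x = np.empty(n)
    x[0] = 0.0
    steps = rng.choice([-step, step], size=n)          # the coin tosses
    for t in range(1, n):
        x[t] = min(max(x[t - 1] + steps[t], lo), hi)   # clamp at the boundaries
    return x

walk = bounded_random_walk(200_000)
```

An unbounded walk of this length would typically wander hundreds of units from zero; the bounded version never leaves the 28-degree channel.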

Why should it be bounded?

Simple physics tells us that, even in the complete absence of greenhouse gases, the planet cannot get any colder than the Ice Age temperature of -18 C because, at that temperature, the earth’s surface radiates the same amount of heat that it receives from the sun. This is a consequence of the Stefan-Boltzmann Law and it accounts for the lower boundary.
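The -18 C figure can be checked with a back-of-envelope radiative-balance calculation (the solar constant and albedo below are standard textbook values, not taken from the article):

```python
# Radiative-balance ("effective") temperature of the Earth with no greenhouse
# effect: absorbed sunlight S(1 - a)/4 must equal emitted blackbody flux
# sigma * T^4 (Stefan-Boltzmann law).
SIGMA = 5.670e-8      # Stefan-Boltzmann constant, W m^-2 K^-4
S = 1361.0            # solar constant, W m^-2 (standard modern value)
ALBEDO = 0.30         # planetary albedo (assumed typical value)

T_eff = (S * (1.0 - ALBEDO) / (4.0 * SIGMA)) ** 0.25
print(round(T_eff - 273.15, 1))   # about -18.6 deg C, close to the -18 C in the text
```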

It is an observed fact that the surface temperature of the sea under natural conditions in the tropics rarely rises above 28 deg C. Adding extra heat causes no further increase in temperature; instead it drives more rapid evaporation, followed by more vigorous turbulent convection (a stochastic process), which carries the extra heat to the top of the atmosphere, from where it radiates into space. This accounts for the upper boundary.

The stochastic modeler’s theory of climate as a bounded random walk is physically reasonable.

On the other hand, a deterministic modeler (e.g. the palaeoclimatologist Richard Alley in his YouTube video), looking at the same Ice Age temperature time series, sees large, rapid fluctuations which he cannot explain because he sees things deterministically. His response? Climate is obviously highly unstable, we don’t understand why, and we should proceed with great caution.

In the climate community there is a growing awareness of the need for stochastic models (see, for example, Judith Curry’s blog, Climate Etc). For this reason deterministic climate models are often run many times with slightly differing starting conditions; the resulting set of model outputs is referred to as a model “ensemble”, after a similar concept in statistical theory. However, generating an ensemble of predictions from a variety of starting points does not in itself turn a deterministic model into a stochastic model, and statistical deductions based on such ensembles are highly dubious. Nor has there been any rigorous attempt to test such models in the frequency domain.
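The distinction can be illustrated with a toy chaotic system (my example, not from the article): an “ensemble” of runs of a fully deterministic map from minutely perturbed starting points soon decorrelates, yet every individual run is exactly reproducible and contains no randomness at all.

```python
import numpy as np

def logistic_run(x0, n, r=4.0):
    """Iterate the fully deterministic, chaotic logistic map x -> r*x*(1-x)."""
    xs = np.empty(n)
    xs[0] = x0
    for t in range(1, n):
        xs[t] = r * xs[t - 1] * (1.0 - xs[t - 1])
    return xs

# An "ensemble": identical model, minutely perturbed starting conditions.
ensemble = [logistic_run(0.2 + 1e-10 * k, 60) for k in range(10)]
spread_start = np.std([run[0] for run in ensemble])
spread_end = np.std([run[-1] for run in ensemble])
# The runs decorrelate completely, yet no run contains any randomness:
# rerunning from the same x0 reproduces it exactly.
```

The spread of such an ensemble measures sensitivity to initial conditions, not the stochastic character of the underlying forcings, which is the article’s objection to treating these ensembles statistically.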

For a model to be truly stochastic, the various parameterisations within the model need to be randomised as well as the initial conditions. These include all those forcings and processes which have a stochastic character: turbulent convection, cloud formation, wind stress and so on. The problem is that, if you do this, the various model runs rapidly diverge from one another and model predictions become obviously worthless. Furthermore, some forcings, such as sea-floor heating by volcanic activity, are assumed to be evenly distributed over the entire ocean floor and to be steady state. Volcanoes on land are sparse and intermittent, and it seems that volcanoes under the ocean are sparse and intermittent too (e.g. White et al., 2003).

Submarine volcanoes therefore constitute a major stochastic forcing, one which exceeds both tidal friction and wind stress in magnitude but which is completely ignored by climate modelers.

And the present pause? To a stochastic modeler it comes as no surprise. It could have been predicted 20 years ago on a desktop computer using a simple autoregressive (AR) model. However, such prosaic findings are rarely funded or published.
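As a sketch of what such a desktop exercise might look like (illustrative only; the article does not give its model details, and the “observed” record below is synthetic), one can fit an AR(1) model to a temperature-like series and generate simulated futures, many of which contain decade-long flat or cooling stretches:

```python
import numpy as np

rng = np.random.default_rng(7)

def simulate_ar1(phi, sigma, n):
    """Generate an AR(1) series x[t] = phi * x[t-1] + eps[t]."""
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.normal(0.0, sigma)
    return x

def fit_ar1(x):
    """Least-squares estimate of the lag-one coefficient and noise std."""
    x = np.asarray(x, float) - np.mean(x)
    phi = np.dot(x[1:], x[:-1]) / np.dot(x[:-1], x[:-1])
    resid = x[1:] - phi * x[:-1]
    return phi, resid.std()

# Fit to a synthetic "observed" red-noise record, then simulate futures.
obs = simulate_ar1(0.95, 0.1, 150)
phi_hat, sig_hat = fit_ar1(obs)
futures = [simulate_ar1(phi_hat, sig_hat, 100) for _ in range(20)]
# In a red-noise world, multi-decade pauses appear in many simulated
# futures without any change in external forcing.
```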


Pelletier, J.D. (2002), “Natural variability of atmospheric temperatures and geomagnetic intensity over a wide range of time scales”, PNAS, 99, suppl. 1, pp. 2546–2553.

White, J.D.L., Smellie, J.L. and Clague, D.A. (2003), “Introduction: A Deductive Outline and Topical Overview of Subaqueous Explosive Volcanism”, in Explosive Subaqueous Volcanism, Geophysical Monograph 140, American Geophysical Union, doi:10.1029/140GM01.

A version of this article appeared in Quadrant Online on 20 October 2014.

3 thoughts on “pause for thought”

  1. From a statistical point of view, the core claim of the Warmists is, itself, nonsense. There has been no significant warming. There was nothing unusual about the climate of the twentieth century; if anything it was rather benign. The whole thing is a furphy. When people talk of the “warmest decade since records began”, they are talking about only 150 years or less. There is ample evidence from ice-cores and elsewhere that climate – rainfall, average temperatures and so on – varies on every time scale, and that the longer the time period the larger the variation (out to 40,000 years).
    Hence “since records began” refers to a quiet little interlude in this huge climate drama.

  2. Thanks, Fang, for the pareidolia word. I had not heard it before but it describes a widespread delusion in the geosciences colloquially known as “wiggle matching”.
    I attempt to address this in my Bounded Random Walk page but I think it needs more work. People still don’t get it.

  3. Very informative work. A challenge I recently set myself as a New Year’s resolution: to publicise the Pelletier (2002) reference, doing my bit to help save the world from its present-day Climate Change pareidolia (the human tendency to read significance into random or vague stimuli). Pareidolia seems to be worst for the high-amplitude, long-period random variations so characteristic of climate.
    Ironically, one of the first branches of science to be created in the new humanistic-discipline-led Department of Energy under President Carter, Climate Science, was one of the least suitable, because it is the most vulnerable to pareidolia. As a result it is now run by people insufficiently trained in the underlying stochastic propensities of nature.
