bounded random walk

[Figure 1]

Figure 1: Smoothed variance density spectrum, displayed on linear scales, of the Ice Age proxy temperature time series shown in Figure 2(a). The frequencies of eccentricity, obliquity and precession of the earth in its orbit are labeled “e”, “o” and “p” respectively.

The 100,000 year glacial cycle is a bounded random walk

John Reid

 

Abstract

The 100,000 year (100 kyr) peak in the variance spectra of late Pleistocene time series has long been regarded as problematic because there is very little variation in high-latitude insolation near this frequency. A coin-tossing numerical experiment and log-log spectra are used to demonstrate that this peak is the outcome of a bounded, stochastic random walk rather than the result of cyclical astronomical forcing.

1. Introduction

Milankovic (1941) was the first to propose that the cycling of climate apparent in the geological record and known as “the Ice Ages” could be due to secular variations in the earth’s orbit around the sun. These variations were thought to give rise, in turn, to variations in solar insolation at latitudes north of 65° N, which affect the growth and collapse of the Northern Hemisphere ice sheet. The pioneering work of Hays et al (1976) identified clear peaks in the spectra of deep-sea sediment records and related them to the eccentricity, obliquity and precession of the orbit. The “eccentricity peak” near 100 kyr dominated their spectra.

More recently, however, a number of workers have questioned the astronomical origin of the 100 kyr peak because variations in orbital eccentricity have little effect on insolation; see, e.g., Muller and MacDonald (1997), Ridgwell et al (1999) and Lisiecki (2010).

I propose that this spectral peak is not evidence of any underlying cyclical phenomenon but rather the outcome of a random walk combined with a bounding or clipping process which has the effect of attenuating variance density at low frequencies. The 100 kyr peak is due to high-pass-filtered red noise.

2. Estimation and Display of Variance Density Spectra

Spectral analysis is concerned with the computation of “power spectra” or, more properly, the estimation of variance density spectra. Variance is the statistical analogue of power or energy in physics, and a spectrum is a graph which displays how this energy or variance is distributed with frequency, wave number or wavelength. The method rests on the Power Theorem of Fourier analysis, which tells us that variance density integrated with respect to frequency equals variance integrated over time. It provides a useful tool for assessing whether periodic processes are present in noisy data.
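The Power Theorem is easy to verify numerically. The following sketch (in Python; illustrative only, not the code used for the figures in this article) checks that the mean square of a random series equals its squared Fourier magnitudes summed over frequency:

    import numpy as np

    x = np.random.randn(1024)                    # a random test series
    x = x - x.mean()                             # remove the mean so variance = mean square
    X = np.fft.fft(x)
    var_time = np.mean(x**2)                     # variance computed in the time domain
    var_freq = np.sum(np.abs(X)**2) / len(x)**2  # variance density summed over frequency
    print(var_time, var_freq)                    # the two values agree to rounding error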

The widespread use of numerical variance spectra became possible with the advent of computers as research tools in the 1960s. The seminal works on the topic at that time were Blackman and Tukey (1958) and Cooley and Tukey’s (1965) ingenious Fast Fourier Transform algorithm. This was the methodology used by Hays et al (1976).

While Blackman and Tukey did, to some extent, address the statistical nature of spectral estimation, their work was concerned primarily with communications engineering and was not intended as a tool for statistical inference. As a consequence their windowing technique is inappropriate in a statistical context. They presumed that the “periodogram”, the unwindowed, unfiltered squared modulus of the Fourier transform, is not a consistent estimator of the population spectrum, i.e. that its variance does not tend to zero with increasing sample size, leading to an unacceptably noisy spectral estimate. Thus windowing in the time domain, in order to decrease noise, became an essential feature of practical spectral analysis.

This was incorrect. At each frequency, the real and imaginary parts of each complex value of the Fourier transform of a random sequence can be shown to be independent and Gaussian. It follows that their sum of squares has a chi-squared distribution with 2 degrees of freedom, and that this is independent of L, the length of the sequence. If N periodograms of independent sequences from the same process are added, each ordinate will have a chi-squared distribution with 2N degrees of freedom. The variance of such a chi-squared distribution is 4N. Dividing by N to extract the mean leads to a spectral estimate for which each ordinate has variance 4N/N² = 4/N. This tends to zero as N tends to infinity and so, by definition, the periodogram is indeed a consistent estimator. Blackman and Tukey’s mistake was to confuse L, the length of the sequence, with N, the number of spectra being averaged. Attempts to circumvent this non-existent problem have detracted from the viability of the periodogram as a statistical tool.
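This argument is easily checked by simulation. The sketch below (again Python, illustrative only) averages the periodograms of N independent white-noise sequences and shows that the relative variance of each ordinate falls off as 1/N, which is the consistency property in question:

    import numpy as np

    L = 256                                      # length of each sequence
    for N in (1, 10, 100, 1000):
        # one periodogram per row: |FFT|^2 / L, for N independent sequences
        P = np.abs(np.fft.fft(np.random.randn(N, L), axis=1))**2 / L
        mean_P = P.mean(axis=0)                  # average of the N periodograms
        # relative variance across the non-zero ordinates falls roughly as 1/N
        print(N, np.var(mean_P[1:L//2]) / np.mean(mean_P[1:L//2])**2)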

Indeed, windowing became something of a shibboleth among researchers using spectral analysis, and ever more sophisticated techniques such as the multi-taper method (Thomson, 1982) came into vogue. Such techniques may have their uses in communications engineering but are unnecessary in most statistical applications. The difference is that in the former field unlimited test data are readily available, whereas in the latter data sets are frequently limited and expensive, so that the researcher needs to glean the maximum amount of information from whatever data are available. Windowing the data in the time domain, or its equivalent, convolution in the frequency domain, does not help, because frequency resolution is always sacrificed for amplitude resolution and vice versa; both cannot be known precisely at once, in a manner which resembles the Heisenberg uncertainty principle.

Here we use the unwindowed, unfiltered periodogram, which, as shown above, is a consistent estimator of the population spectrum, albeit a rather noisy one. So et al (1999) have demonstrated numerically that the unfiltered periodogram is the most efficient method of identifying cyclical signals embedded in random noise.

Both the frequency and the variance density estimate are displayed on logarithmic scales. This has the following advantages:

  • the spectrum looks much less noisy,
  • any power law relationship appears as a straight line and so can be easily recognized and its index estimated,
  • the relationship between the stochastic (random) and deterministic (cyclical) components of a time series can be more clearly seen,
  • estimated variance density and both its upper and lower confidence limits are more easily displayed over a wider range of values,
  • significant troughs can be detected in addition to significant peaks and
  • false or insignificant peaks and troughs are less likely to be mistaken for the real thing.

3. Application to Ice Core Time Series

The spectrum of 800 kyr of proxy ice-core temperatures is shown in Figure 1. The spectrum closely resembles the spectra shown in Hays et al (1976) and was computed in a similar way: both frequency and variance density were plotted on linear scales, and the periodogram was convolved with a 3-wide running-mean (boxcar) filter to make it less noisy. The classical three spectral peaks attributed to eccentricity (“e”, f = 0.01 kyr⁻¹), obliquity (“o”, f = 0.0244 kyr⁻¹) and precession (“p”, f = 0.043 kyr⁻¹) can be clearly seen.
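A minimal sketch of this computation, with a synthetic random walk standing in for the 1024-point proxy series of Figure 2(a) so that the fragment runs as written:

    import numpy as np

    # stand-in for the 1024-point EPICA proxy series; substitute the real data
    temps = np.cumsum(np.random.randn(1024))
    dt = 0.78                                    # sample spacing in kyr (~780 yr)

    P = np.abs(np.fft.rfft(temps - temps.mean()))**2 / len(temps)
    f = np.fft.rfftfreq(len(temps), d=dt)        # frequency axis in cycles per kyr
    # 3-wide running-mean (boxcar) smoothing, as used for Figure 1
    P_smooth = np.convolve(P, np.ones(3) / 3, mode='same')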

[Figure 2]

Figure 2: (a) EPICA ice core proxy temperature time series for the last 800,000 years due to Jouzel et al (2007). Times of Ice Age Terminations are labeled from 1 to 5. (b) Cumulative sum of the number of heads, H, minus the number of tails, T, from a computer simulation of 1000 throws of a coin. (c) Cumulative sum of heads, H, minus tails, T, when the series H-T is bounded to lie in the range (-4, 4).

The time series from which the spectrum of Figure 1 was calculated came from the EPICA Dome C Ice Core 800 kyr Deuterium Data and Temperature Estimates (Jouzel et al, 2007), downloaded from the World Data Center for Paleoclimatology website. The Temperature Estimate column was extracted and all the values in each 780-year interval were averaged to give a single value. In this way a 1024-point time series of equally spaced values was generated. It is shown in Figure 2(a). The five most recent Ice Age Terminations, Terminations I through V, are labeled “1” through “5” in the figure.
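The preprocessing step can be sketched as follows (Python; the file name and column positions are assumptions for illustration and must be adapted to the actual EPICA download):

    import numpy as np

    # hypothetical file name and column layout; adapt to the real data file
    age, temp = np.loadtxt("epica_domec_temps.txt", usecols=(0, 1), unpack=True)

    bin_width = 780.0                            # years
    bins = (age // bin_width).astype(int)
    # average all values falling in each 780-year interval; empty bins give NaN
    series = np.array([temp[bins == b].mean() if np.any(bins == b) else np.nan
                       for b in range(1024)])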

Note that there are no cycles with periods near 100 kyr evident in the time series itself. The last five Terminations have an average spacing close to 100 kyr, but closer examination shows that they are in fact either about 120 kyr apart (I and II, II and III) or about 80 kyr apart (III and IV, IV and V). Huybers and Wunsch (2005) have shown that the ice sheets terminate every second or third obliquity cycle.

[Figure 3]

Figure 3. Unfiltered periodogram spectral estimates of the three time series shown in Figure 2, plotted using logarithmic scales on both axes. The spectra have been displaced in the vertical direction by arbitrary amounts for clarity. The dashed lines show the locus of the -2 power law which best fits each spectrum. Each grey band shows the chi-squared 95 percent confidence limits about this locus. The labels “e”, “o” and “p” mark the frequencies of eccentricity, obliquity and precession of the earth in its orbit.

Figure 3(a) (upper graph) shows the spectral density estimate derived from the unfiltered periodogram, plotted using logarithmic scales. Also shown, as the grey band, are the upper and lower 95 percent confidence limits. The null hypothesis is that the ordinates are statistically independent, have a chi-squared distribution with 2 degrees of freedom, and that the population spectrum is a power law with an index of -2.
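Under this null hypothesis the confidence band is straightforward to construct. A sketch (Python with scipy assumed; A is the fitted power-law amplitude):

    import numpy as np
    from scipy.stats import chi2

    def confidence_band(f, A):
        """95 percent band about a fitted -2 power law, assuming each
        periodogram ordinate is distributed as S(f) * chi-squared(2) / 2."""
        S_fit = A * f**-2.0
        lo = S_fit * chi2.ppf(0.025, df=2) / 2   # divide by 2, the chi-squared(2) mean
        hi = S_fit * chi2.ppf(0.975, df=2) / 2
        return lo, hi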

Note that, in Figure 3(a):

  • the peak at the eccentricity frequency, “e”, is barely significant,
  • there are significant peaks at both the obliquity and precession frequencies,
  • the variance density, log(S), is related linearly to log(f), with a slope close to -2, and
  • there is a significant trough below 0.01 kyr⁻¹.

4. The Implications

Power law relationships are of prime importance in physics because of what they reveal about underlying physical processes, in fields such as radio astronomy, cosmic rays and surface gravity waves. Here we have a relationship summarized by

log(S) = ν log(f) + const

i.e.

S = A f^ν

where A is a constant and the other symbols represent population values. Thus there is good evidence of a power law relationship between variance density and frequency, with exponent ν = -2.
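The exponent is estimated by an ordinary least-squares fit in log-log coordinates. A self-contained sketch, using a synthetic random walk in place of the proxy series:

    import numpy as np

    x = np.cumsum(np.random.randn(1024))         # a pure random walk as test input
    P = np.abs(np.fft.rfft(x - x.mean()))**2 / len(x)
    f = np.fft.rfftfreq(len(x), d=1.0)

    mask = f > 0                                 # exclude the zero-frequency ordinate
    nu, log_A = np.polyfit(np.log(f[mask]), np.log(P[mask]), 1)
    print(f"estimated power-law index: {nu:.2f}")  # close to -2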

Such an inverse square power law is indicative of an AR(1) stochastic process or “random walk”. If an unbiased coin is repeatedly tossed and the number of heads minus the number of tails is accumulated after each throw, the result is a random walk. Such a process can readily be simulated by a “coin-tossing” computer program in which the Rnd function, which generates a pseudo-random number between 0 and 1, decides whether a particular “toss” is a “head” (Rnd > 0.5) or a “tail” (Rnd < 0.5). 1000 numbers were generated in this way, then processed and displayed in the same manner as the ice-core time series under discussion.
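The original program was written in Visual Basic; an equivalent sketch in Python:

    import random

    tosses = 1000
    walk, total = [], 0
    for _ in range(tosses):
        # a "head" (+1) if the pseudo-random draw exceeds 0.5, a "tail" (-1) otherwise
        total += 1 if random.random() > 0.5 else -1
        walk.append(total)                       # cumulative heads minus tails, Figure 2(b)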

The resulting time series is shown in Figure 2(b) and its spectrum is shown in Figure 3(b). As expected, the sequence exhibits an inverse square power law spectrum similar to that of the ice-core time series under consideration. There are important differences, however: there are no significant peaks like the obliquity and precession peaks, and there is no trough at low frequencies. Such a low-frequency trough in the proxy temperature time series appears to imply a relationship between values which are widely separated in time, i.e. more than 100 kyr apart. This is puzzling when one considers that the power law nature of the rest of the spectrum implies that values more than about 1 kyr apart are uncorrelated.

However, this argument assumes that everything is linear. There is another mechanism by which a low-frequency trough can be generated, and it involves a non-linear process. When the time series of Figures 2(a) and 2(b) are compared, the temperature series of Figure 2(a) appears to be constrained to lie within a particular range, whereas the series of Figure 2(b) wanders far from its initial value in a manner typical of a random walk. This suggests that the global temperature, for which the temperatures of Figure 2(a) are a proxy, while fundamentally random, is nonetheless constrained to lie within certain limits; it appears to be bounded above and below.

The coin-tossing program was therefore modified to bound the walk above and below: if the cumulative total exceeded 4 it was set to 4, and if it fell below -4 it was set to -4, i.e. the random walk was constrained to remain within the range (-4, 4).
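In the sketch above this modification amounts to clipping the running total:

    import random

    tosses = 1000
    bounded, total = [], 0
    for _ in range(tosses):
        total += 1 if random.random() > 0.5 else -1
        total = max(-4, min(4, total))           # bound the walk to the range (-4, 4)
        bounded.append(total)                    # the series of Figure 2(c)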

The resulting time series is shown in Figure 2(c) and its spectrum in Figure 3(c). Note that:

  • the time series of Figure 2(c) now looks considerably more like the time series of Figure 2(a), and
  • the spectrum of Figure 3(c) is similar in shape to that of Figure 3(a) in having a significant trough at low frequencies.

Finally, the bounded random walk spectrum of Figure 3(c) was replotted on linear scales and subjected to the same frequency-domain smoothing that was used for the spectrum in Figure 1. The result, shown in Figure 4, displays a spectral peak which is very similar in appearance to the 100 kyr peak of Figure 1.

[Figure 4]

Figure 4. The smoothed variance density spectrum of the time series shown in Figure 2(c), displayed using linear scales.

5. Conclusions

The assumptions underlying the numerical experiments described above are not unreasonable in the context of climate. The measured quantity, temperature, T, is related to various physical processes by

T ∝ ∫ q dt

where q represents the transfer of some energetic quantity such as heat. If q is serially uncorrelated over time (i.e. white) then T will have a power law spectrum with an index of -2. This is particularly true of climate processes; see for example Hasselmann (1976). In addition, the idea that such a random walk will be bounded is not unexpected. Given a constant influx of radiation from the sun, at large scales the average temperature of the surface of the earth cannot fall below about -18°C, by the Stefan-Boltzmann law of radiation. At the other extreme, sea surface temperatures rarely exceed 28°C because the high evaporation rate from the sea surface at high temperatures gives rise to tropical storms which rapidly convect heat to the tropopause, whence it is radiated into space. Bounding may also be a consequence of limits to ice sheet extension and greenhouse gas concentration.

The presence of a peak in the linearly displayed variance density of Figure 4 might lead us to conclude that there is a strong cyclic component in the time series whereas, in fact, this time series is known to have been created by a purely random process similar to tossing a coin. The peak is entirely the outcome of a bounded random walk. It seems likely, then, that the 100 kyr peak in the proxy temperature record is also the outcome of a bounded random walk, a process unrelated to astronomical forcing.

Thus the climate of the late Pleistocene can be conceived of as a random walk between two temperature boundaries. The upper temperature boundary is characterized by small ice-caps, saturated CO2 and H2O absorption bands and a convection-transport atmosphere, while the lower temperature boundary is characterized by large ice-caps, low atmospheric CO2 and H2O and a radiation-transport atmosphere. Climate oscillates between these two metastable states in a random manner which includes some phase-locking to the obliquity cycle of the earth’s orbit.

6. References

Blackman, R.B. and J.W. Tukey (1958). The Measurement of Power Spectra from the Point of View of Communication Engineering. Dover, New York.

Cooley, J.W. and J.W. Tukey (1965). An algorithm for the machine calculation of complex Fourier series. Math. Comput., 19, 297-301.

Hasselmann, K. (1976). Stochastic climate models, Part I, Theory. Tellus XXVIII, 6.

Hays, J.D., J. Imbrie and N.J. Shackleton (1976). Variations in the Earth’s Orbit: Pacemaker of the Ice Ages. Science, 194, 4270, pp 1121-1132.

Huybers, P. and C. Wunsch (2005). Obliquity pacing of the late Pleistocene glacial terminations. Nature 434, 491-494.

Jouzel, J., Masson-Delmotte, V., Cattani, O., Dreyfus, G., Falourd, S., Hoffmann, G., Minster, B., Nouet, J., Barnola, J.M., Chappellaz, J., Fischer, H., Gallet, J.C., Johnsen, S., Leuenberger, M., Loulergue, L., Luethi, D., Oerter, H., Parrenin, F., Raisbeck, G., Raynaud, D., Schilt, A., Schwander, J., Selmo, E., Souchez, R., Spahni, R., Stauffer, B., Steffensen, J.P., Stenni, B., Stocker, T.F., Tison, J.L., Werner, M. and Wolff, E.W. (2007). Orbital and Millennial Antarctic Climate Variability over the Past 800,000 Years. Science, 317, 5839, pp 793-797. Data website (as accessed 13 September 2013): http://www.ncdc.noaa.gov/paleo/icecore/antarctica/domec/domec_epica_data.html

Lisiecki, L.E. (2010). Links between eccentricity forcing and the 100,000-year glacial cycle. Nature Geoscience, 3, 349-352.

Milankovic, M. (1941). Kanon der Erdbestrahlung und seine Anwendung auf das Eiszeitenproblem. Königlich Serbische Akademie, Belgrade.

Muller, R.A., and G.J. MacDonald (1997). Spectrum of 100-kyr glacial cycle: Orbital inclination, not eccentricity. Proc. National Acad. Sciences USA. 94, 8329-8334.

Ridgwell, A.J., A.J. Watson and M.E. Raymo (1999). Is the spectral signature of the 100 kyr glacial cycle consistent with a Milankovitch origin? Paleoceanography, 14, 437-440.

So, H.C., Y.T. Chan, Q. Ma and P.C. Ching (1999). Comparison of Various Periodograms for Sinusoid Detection and Frequency Estimation. IEEE Trans. Aerospace Electronic Systems, 35, 945-951.

Addendum

Since writing the above I have come across a paper by Jon Pelletier of the University of Arizona which shows the temperature spectrum over a wider range of time scales:

[Figure: power spectrum of atmospheric temperatures over time scales from one day to one million years, after Pelletier (2002)]

The whitening at low frequencies can be clearly seen.

The abstract states:

We show that a simple stochastic model, which treats the flux of heat energy in the atmosphere by convective instabilities with random advection and diffusive mixing, does a remarkable job at matching the observed power spectrum of historical and proxy records for atmospheric temperatures from time scales of one day to one million years (Myr).

Reference

Pelletier, J.D. (2002). Natural variability of atmospheric temperatures and geomagnetic intensity over a wide range of time scales. PNAS, 99, supp. 1, pp 2546-2553.

7 thoughts on “bounded random walk”

  1. The objectives of signal processing and the objectives of statistical inference are quite different even though they share some methodology. The signal processing engineer has grown up tuning IF strips (well, in my day), designing Chebyshev and Butterworth filters and so on, all to the end of making a nice square boxcar passband so as to pass an undistorted signal. On the other hand the statistician is seeking to demonstrate the significance or otherwise of a cyclical / deterministic signal in the presence of noise. Hence he doesn’t give a hoot about squareness or distortion and is solely interested in getting the maximum information out of a very small amount of data. To him a boxcar passband is a disadvantage because it precludes the assumption of independent Gaussian ordinates and the resulting 2 dof chi-squared distribution from which he gets his confidence limits.

    Also, any finite data sample always has some sort of blur function in the frequency-time plane, representing an uncertainty similar to the Heisenberg principle – you cannot know both frequency and time with perfect precision. Filtering and windowing do not help; they merely change the shape of the blur function, they cannot make it smaller. Arcane methodologies such as the Thomson multi-taper method are completely unnecessary in (statistical) time series analysis. The unfiltered periodogram is as good as it gets.

    Your last question is spot-on. I haven’t sorted out the theory yet, but from a practical point of view, remember we are only looking at a single realization, one member of an ensemble. The spurious peak(s) can be anywhere in or near the low frequency plateau, or there may be no peak at all. I chose a 1024-coin-toss experiment which gave a peak most like that in Figure 1 (it took about 5 runs). Do the numerical experiment yourself and have a look. I can send you the code if you like (it’s in Visual Basic).

  2. Just a word from a signal processing engineer in defence of Blackman & Tukey. Performing a DFT on a finite set of temporal data without applying a windowing function impresses on each frequency point a sin(x)/x modulation, where x is proportional to the width of the data set. As you will know, using a windowing function reduces these sidebands, but at the cost of widening the original data point. It’s always worth looking at both the windowed and unwindowed data, though. I presume that in your application the size of the data set means that this phenomenon is probably negligible.
    Can the relationship between the frequency of the pseudo-cyclic components and the function governing the bounds be determined?

  3. Hello AC Osborne. Thank you for making the very first comments.

    You are most likely correct about solar variability – other G-type stars are variable. However, the sun’s variability or otherwise is not relevant to the point I am making in this article. I am pointing out that apparently cyclic behaviour such as the “eccentricity peak” can be the outcome of a purely random process such as tossing a coin. This is particularly true of climate cycles, which have pink or red spectra at periods shorter than 40,000 years. I am also suggesting that rather than “tipping points”, the Earth’s climate in fact has “boundary points”; i.e. that the Earth is not like Venus, as Jim Hansen would have us believe. I only assumed a constant input from the Sun in order to make this argument.

      1. Thanks for the link. What a great blog! Yes, the Scafetta paper looks very convincing. I shall go through it in more detail when I have time.

        People don’t seem to be quite getting the points that I am making here (my fault for not explaining it better) and they are:
        1. It is possible to generate a spectrum with a strong peak suggestive of a cyclical component, by a purely random process in which no cyclic component is actually present, and
        2. The ice ages look like a bounded random walk with some real cyclical components.

        Perhaps a better way to have stated #2 would be that the sequence of residuals after the removal of the cyclic components is a bounded random walk.

        Have a look at Pause for Thought – it is the belief that every natural phenomenon can be described in deterministic/cyclic terms that I am opposed to. I am not saying there are no cyclic forcings.

        Perhaps I should rewrite this page to make this clearer. Thank you for being such a good de facto referee.

  4. I have come to this forum via Euan’s Energy Matters. Welcome to the Web and may you have many visitors.

    I am not sure about your findings because of one major assumption that you accept, namely that “Total Insolation” does not vary very much and that said total insolation is the key to climate change.
    First of all, I have had discussions with Leif Svalgaard about the history of the Sun and find his explanations full of hubris; I just do not believe anyone can say with his absolute surety how the Sun was acting 10,000 years ago, let alone 10M or 5B years ago.
    The other point about changes in solar radiation is that although the TSI does not appear to vary very much, its radiation frequencies vary considerably.

    1. PS, for solar scientists who are so sure of the past, they sure don’t seem to have enough understanding of solar mechanics to predict the future, especially the current solar cycle.
