# Decay of correlations

Nikolai Chernov (2008), Scholarpedia, 3(4):4862. doi:10.4249/scholarpedia.4862, revision #91188

Curator: Nikolai Chernov

Decay of correlations is a property of chaotic dynamical systems. This property makes deterministic systems behave as stochastic or random in many ways.

## Background

### Measure preserving transformations

A deterministic dynamical system with discrete time is a transformation $$f \colon X \to X$$ of its phase space (or state space) $$X$$ into itself. Every point $$x \in X$$ represents a possible state of the system. If the system is in state $$x\ ,$$ then it will be in state $$f(x)$$ in the next moment of time.

Given the current state $$x=x_0 \in X\ ,$$ the sequence of states $x_1 = f(x_0),\ x_2 = f(x_1),\ \ldots\ ,\ x_n = f(x_{n-1}),\ \ldots$ represents the entire future; note that $$x_n = f^n(x_0)$$ is the state at time $$n\ .$$ If the map $$f$$ is invertible, then the past states $$x_{-n} = f^{-n}(x_0)$$ can be determined as well.

It is common to assume that the map $$f$$ preserves a probability measure, $$m\ ,$$ on $$X\ ;$$ this precisely means that for any measurable subset $$A \subset X$$ one has $$m(A) = m(f^{-1} (A))\ ,$$ where $$f^{-1}(A)$$ denotes the set of points mapped into $$A\ .$$ The invariant measure $$m$$ describes the distribution of the sequence $$\{x_n\}$$ for any typical initial state $$x_0\ .$$
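The invariance condition can be checked numerically for a concrete example: the angle doubling map $$f(x)=2x$$ (mod 1), which preserves Lebesgue measure on $$[0,1)\ .$$ The following sketch estimates $$m(A)$$ and $$m(f^{-1}(A))$$ by Monte Carlo (the test interval $$A$$ and all parameters are arbitrary choices of ours):

```python
import random

random.seed(0)

def f(x):
    # angle doubling map on [0, 1); it preserves Lebesgue measure
    return (2.0 * x) % 1.0

N = 200_000
pts = [random.random() for _ in range(N)]   # sample from the invariant measure m

a, b = 0.2, 0.7                                  # an arbitrary test interval A
frac_A = sum(a <= x < b for x in pts) / N        # estimates m(A) = 0.5
frac_pre = sum(a <= f(x) < b for x in pts) / N   # estimates m(f^{-1}(A))
print(frac_A, frac_pre)                          # both near 0.5
```

Both fractions agree (up to sampling error) with $$m(A)=0.5\ ,$$ as the invariance $$m(A)=m(f^{-1}(A))$$ requires.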

### Observables

In applications the actual states $$x_n \in X$$ are often not observable. Instead, one usually observes a real-valued function $$F$$ on $$X\ ,$$ called an observable. At time $$n$$ one observes the value $$F(x_n)\ .$$ Thus, instead of dealing with the sequence of states $$\{x_n\}$$ one "sees" a sequence of observed values of that function, $$F_n = F(x_n)\ .$$

We can regard the function $$F$$ on $$X$$ as a random variable (with respect to the probability measure $$m$$); for each $$n$$ the function $$F_n = F \circ f^n$$ is a random variable, too. Thus one observes a sequence of random variables, $$\{F_n\}\ .$$

An important fact is that the sequence $$\{F_n\}$$ is a stationary stochastic process (with discrete time). Its stationarity follows from the invariance of $$m\ .$$ It is usually assumed that the observable $$F$$ is square integrable, i.e. $$m(F^2)<\infty\ .$$ Thus our random variables $$F_n$$ have finite mean value $\mu_F = m(F)=\int_X F\, dm$ and variance $\tag{1} \sigma_F^2 = m(F^2)-[m(F)]^2.$
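The stationarity of $$\{F_n\}$$ can be seen in a small numerical sketch (our own illustration): for the doubling map $$f(x)=2x$$ (mod 1) with the observable $$F(x)=x\ ,$$ the mean and variance of $$F_n$$ do not change with $$n\ ;$$ their exact values are $$\mu_F = 1/2$$ and $$\sigma_F^2 = 1/3 - 1/4 = 1/12\ .$$

```python
import random

random.seed(1)

def f(x):
    return (2.0 * x) % 1.0        # doubling map; preserves Lebesgue measure

N = 400_000
xs = [random.random() for _ in range(N)]

# stationarity: F_n = F o f^n has the same mean and variance for every n
for n in range(4):
    mu = sum(xs) / N                               # estimates m(F) = 1/2
    var = sum(x * x for x in xs) / N - mu * mu     # estimates sigma_F^2 = 1/12
    print(n, mu, var)
    xs = [f(x) for x in xs]       # advance every orbit one step
```

Each row prints approximately the same mean $$1/2$$ and variance $$1/12\ ,$$ reflecting the invariance of $$m\ .$$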

### Strong law of large numbers

The classical Birkhoff ergodic theorem states that for $$m$$-almost every initial state $$x_0 \in X$$ the time averages converge to the space average, i.e. $\frac{F_0 + F_1 + \cdots + F_{n-1}}{n} \rightarrow \mu_F = m(F) \qquad\text{as}\ n\to\infty.$ In probability theory this property is known as the Strong Law of Large Numbers (SLLN).

In terms of the partial sums of the observed sequence $$F_n$$ $S_n = F_0 + F_1 + \cdots + F_{n-1}$ the Birkhoff ergodic theorem can be stated as $\frac{S_n-n\mu_F}{n} \to 0, \qquad\text{i.e.}\qquad S_n = n\mu_F + o(n).$ In many cases the remainder term $$o(n)$$ is actually $$O(\sqrt{n})\ ,$$ and this is where correlations come into play.
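The convergence of time averages can be observed directly. A numerical sketch (the map, observable, and initial state are our own choices): the fully chaotic logistic map $$f(x)=4x(1-x)$$ preserves the measure with density $$1/(\pi\sqrt{x(1-x)})\ ,$$ and for $$F(x)=x$$ the space average is $$\mu_F = \int_0^1 x\,dm = 1/2\ .$$

```python
def f(x):
    # fully chaotic logistic map; invariant density 1/(pi*sqrt(x(1-x)))
    return 4.0 * x * (1.0 - x)

x = 0.1234          # an arbitrary (typical) initial state
n = 1_000_000
total = 0.0
for _ in range(n):
    total += x      # accumulate F(x_k) for F(x) = x
    x = f(x)

time_avg = total / n
print(time_avg)     # converges to the space average m(F) = 1/2
```

The time average approaches $$1/2$$ even though the orbit itself looks erratic; this is the Birkhoff ergodic theorem at work.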

## Correlations

### Definition

Next consider covariances $\tag{2} C_F(n) = m(F_0F_n) -\mu_F^2 = m(F_kF_{n+k}) - \mu_F^2\qquad \text{(for any}\ k\text{)}.$

If we normalize the $$F_n$$'s in advance so that $$\sigma_F^2=1\ ,$$ then $$C_F(n)$$ becomes the correlation coefficient between the random variables $$F_k$$ and $$F_{n+k}\ ,$$ i.e. between values observed at times that are $$n$$ (time units) apart. If the system behaves chaotically, then for large $$n$$ those values should be nearly independent, i.e. the correlations should decrease (decay) as $$n$$ grows. In the studies of dynamical systems, physics, and other sciences, it is common to slightly abuse terminology and call the $$C_F(n)$$'s correlations even without the normalization assumption $$\sigma_F^2=1\ .$$

More generally, for any two square-integrable observables $$F$$ and $$G$$ the correlations are defined by $C_{F,G}(n) = m(F_0G_n) -\mu_F\,\mu_G = m(F_kG_{n+k}) - \mu_F\,\mu_G\qquad \text{(for any}\ k\text{)}.$ Accordingly, (2) are called autocorrelations.
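In practice, correlations are estimated from a single long trajectory by replacing the expectations with time averages (justified by ergodicity). A minimal estimator sketch (the function name and the i.i.d. test sequence are our own choices):

```python
import random

def empirical_corr(F_series, G_series, max_lag):
    """Estimate C_{F,G}(n) for n = 0..max_lag from one observed trajectory,
    replacing expectations by time averages."""
    N = len(F_series)
    mu_F = sum(F_series) / N
    mu_G = sum(G_series) / N
    out = []
    for lag in range(max_lag + 1):
        m = N - lag
        c = sum(F_series[i] * G_series[i + lag] for i in range(m)) / m
        out.append(c - mu_F * mu_G)
    return out

# sanity check on an i.i.d. (hence uncorrelated) sequence:
random.seed(2)
data = [random.random() for _ in range(300_000)]
cs = empirical_corr(data, data, 3)
print(cs)    # cs[0] near the variance 1/12; later lags near 0
```

On an i.i.d. uniform sequence the lag-0 estimate recovers the variance $$1/12\ ,$$ while the estimates at positive lags are near zero, as expected for independent values.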

### Correlations and mixing

The transformation $$f \colon X \to X$$ is said to be mixing if for any two measurable sets $$A,B \subset X$$ one has $m(A\cap f^{-n}(B)) \to m(A)\, m(B) \quad\text{as}\ n\to\infty.$ The mixing property is related to correlations: precisely, $$f$$ is mixing if and only if correlations decay, i.e. $C_{F,G}(n) \to 0 \quad\text{as}\quad n \to \infty,$ for every pair of square integrable functions $$F$$ and $$G\ .$$ The speed (or rate) of the decay of correlations (also called the rate of mixing) is crucial when one deals with particular observables.
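The mixing condition can also be tested numerically. A sketch for the doubling map $$f(x)=2x$$ (mod 1), with two arbitrarily chosen intervals $$A$$ and $$B$$ of measure $$0.6$$ each:

```python
import random

random.seed(3)

def f(x):
    return (2.0 * x) % 1.0        # doubling map

N = 400_000
pts = [random.random() for _ in range(N)]
A = (0.0, 0.6)                    # m(A) = 0.6
B = (0.3, 0.9)                    # m(B) = 0.6

def mix_estimate(n):
    # fraction of points with x in A and f^n(x) in B, i.e. m(A ∩ f^{-n}(B))
    cnt = 0
    for x in pts:
        y = x
        for _ in range(n):
            y = f(y)
        cnt += (A[0] <= x < A[1]) and (B[0] <= y < B[1])
    return cnt / N

for n in (0, 1, 2, 8):
    print(n, mix_estimate(n))     # tends to m(A) m(B) = 0.36 as n grows
```

At $$n=0$$ the estimate is $$m(A\cap B)=0.3\ ,$$ and as $$n$$ grows it approaches the product $$m(A)\,m(B)=0.36\ :$$ the set $$f^{-n}(B)$$ spreads evenly over the phase space.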

### Correlations and SLLN

The first question where the decay of correlations comes into play is how fast the time averages $$\tfrac{1}{n} S_n$$ converge to the space average $$\mu_F$$ (the convergence is guaranteed by the Birkhoff ergodic theorem).

To determine the order of magnitude of the difference $$S_n-n\mu_F$$ one can estimate its root-mean-square value $$\sqrt{ m([S_n-n\mu_F]^2)}\ .$$ Simple algebra gives $m([S_n-n\mu_F]^2) = nC_F(0) + 2(n-1)C_F(1) +2(n-2)C_F(2)+\cdots+2C_F(n-1).$ Suppose the correlations decay fast enough so that (at least) $\tag{3} \sum_{n=0}^{\infty} |C_F(n)| < \infty.$
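The expansion above can be verified numerically for the doubling map with $$F(x)=x\ .$$ A typical point of this map has independent random binary digits, and the map shifts them, so the orbit can be simulated exactly (no floating-point round-off); a short computation with the digits gives $$C_F(k)=2^{-k}/12\ .$$ A sketch, with all parameters our own choices:

```python
import random

random.seed(4)

# For the doubling map with F(x) = x, a typical point has i.i.d. random
# binary digits and the map shifts them, so x_k is read off a sliding
# K-bit window of a random bit string.  Here C_F(k) = 2^{-k}/12.
n = 10          # number of terms in S_n
M = 200_000     # independent replicates
K = 30          # bits kept per point
mask = (1 << K) - 1

def S_n():
    v = random.getrandbits(K)    # fresh typical initial state
    s = 0.0
    for _ in range(n):
        s += v / (1 << K)        # F(x_k) = x_k
        v = ((v << 1) & mask) | random.getrandbits(1)   # doubling map = shift
    return s

lhs = sum((S_n() - n * 0.5) ** 2 for _ in range(M)) / M   # m([S_n - n mu]^2)
rhs = n / 12 + 2 * sum((n - k) * 2 ** (-k) / 12 for k in range(1, n))
print(lhs, rhs)     # the two sides agree up to sampling error
```

The Monte Carlo estimate of $$m([S_n-n\mu_F]^2)$$ matches the sum $$nC_F(0)+2\sum_{k=1}^{n-1}(n-k)C_F(k)$$ computed from the exact correlations.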

Then the following sum is always non-negative: $\sigma^2 = \sum_{n=-\infty}^{\infty} C_F(n) =C_F(0) + 2 \sum_{n=1}^{\infty} C_F(n),$ and for generic observables $$F$$ it is positive. Note that this $$\sigma^2$$ is different from $$\sigma_F^2$$ in (1); while $$\sigma_F^2$$ characterizes one random variable $$F\ ,$$ this $$\sigma^2$$ characterizes the entire process $$\{F_n\}\ .$$

Under the assumption (3) the mean square of $$S_n-n\mu_F$$ grows as $m([S_n-n\mu_F]^2) = n \sigma^2 + o(n).$ This means that typical values of $$S_n-n\mu_F$$ are of order $$\sqrt{n}\ ;$$ on average they grow as $$\sigma \sqrt{n}\ .$$ One can write $S_n = n\mu_F + O(\sqrt{n}).$

### Correlations and Central Limit Theorem

The above fact leads to an adaptation of the probabilistic central limit theorem (CLT) to chaotic dynamical systems. One says that $$F$$ satisfies the CLT if the sequence $$(S_n-n\mu_F)/\sqrt{n}$$ converges in distribution to normal law $$N(0,\sigma^2)\ .$$ That is, for every real $$z \in (-\infty, \infty)$$ $m\Bigg(\frac{S_n-n\mu_F}{\sqrt{n}} < z\Bigg) \to \frac{1}{\sqrt{2\pi\sigma^2}} \int_{-\infty}^z \exp\Bigg(-\frac{t^2}{2\sigma^2}\Bigg)\, dt \qquad\text{as}\ n\to\infty.$ Usually, the central limit theorem holds whenever the correlations $$C_F(n)$$ decay fast enough; the asymptotics $$|C_F(n)| = O(n^{-(2+\varepsilon)})$$ for some $$\varepsilon>0$$ is often sufficient.
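The CLT can be observed in a simulation (our own illustration). For the doubling map with $$F(x)=x\ ,$$ simulated exactly through random binary digits (the map acts on them as a shift), one has $$C_F(k)=2^{-k}/12\ ,$$ hence $$\sigma^2 = 1/12 + 2\sum_{k\ge 1} 2^{-k}/12 = 1/4\ :$$

```python
import random

random.seed(5)

n = 100         # time length of each sum S_n
M = 20_000      # independent replicates
K = 30          # bits kept per point
mask = (1 << K) - 1

def normalized_sum():
    v = random.getrandbits(K)                # fresh typical initial state
    s = 0.0
    for _ in range(n):
        s += v / (1 << K)                    # F(x_k) = x_k
        v = ((v << 1) & mask) | random.getrandbits(1)   # doubling map = shift
    return (s - n * 0.5) / n ** 0.5          # (S_n - n mu_F)/sqrt(n)

zs = [normalized_sum() for _ in range(M)]
var_hat = sum(z * z for z in zs) / M              # approaches sigma^2 = 1/4
frac_1sigma = sum(abs(z) < 0.5 for z in zs) / M   # N(0, 1/4): about 0.683
print(var_hat, frac_1sigma)
```

The empirical variance of the normalized sums approaches $$\sigma^2=1/4\ ,$$ and the fraction of samples within one standard deviation $$\sigma=1/2$$ is close to the Gaussian value $$0.683\ .$$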

## General issues

### Factors affecting the decay of correlations

The rate of the decay of correlations, i.e. the speed of convergence $$C_{F,G}(n) \to 0\ ,$$ depends on two factors:

• the strength of chaos in the underlying dynamical system $$f \colon X \to X\ ;$$
• the regularity of the observables $$F$$ and $$G\ .$$

Generally, the correlations decay rapidly if both conditions hold:

• the system is strongly chaotic and
• the observables are sufficiently regular.

Standard examples of strongly chaotic systems are

• the angle doubling map $$f(x) = 2x$$ (mod 1) of a circle, which is usually identified with the unit interval $$X=[0,1)$$
• Arnold's cat map $$(x,y) \mapsto (2x+y,x+y)$$ (mod 1) of the unit torus.

In both examples, correlations decay exponentially fast, i.e. $$|C_{F,G}(n)| = O(e^{-an})$$ for some $$a>0\ ,$$ and the Central Limit Theorem holds, whenever the observables $$F$$ and $$G$$ are Hölder continuous. However, for less regular (say, just continuous) observables, correlations may decay arbitrarily slowly and the Central Limit Theorem may fail.
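The exponential decay can be exhibited numerically for the doubling map with the (Hölder, indeed smooth) observable $$F(x)=x\ ,$$ for which a direct computation gives $$C_F(n)=2^{-n}/12\ .$$ To avoid floating-point round-off (the float doubling map loses one binary digit per step), the orbit is simulated exactly as a sliding window over a random bit string:

```python
import random

random.seed(6)

# A typical point of the doubling map has i.i.d. random binary digits, and
# the map shifts them, so the whole orbit is a sliding K-bit window over
# one long random bit string (exact integer arithmetic, no round-off).
N, K = 400_000, 40
bits = [random.getrandbits(1) for _ in range(N + K)]
mask = (1 << K) - 1
v = 0
for b in bits[:K]:
    v = (v << 1) | b

xs = []
for i in range(N):
    xs.append(v / (1 << K))              # x_i with K binary digits
    v = ((v << 1) & mask) | bits[K + i]  # doubling map = shift in one new bit

mu = sum(xs) / N
def C(lag):                              # empirical autocorrelation of F(x) = x
    m = N - lag
    return sum(xs[i] * xs[i + lag] for i in range(m)) / m - mu * mu

for lag in range(5):
    print(lag, C(lag))   # exact values 2^{-lag}/12: halve at every step
```

The printed values halve at each lag, matching $$C_F(n)=2^{-n}/12\ ,$$ i.e. exponential decay with rate $$a=\log 2\ .$$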

In dynamical systems where chaos is weak (for example, where "traps" exist in the phase space), correlations often decay more slowly, i.e. subexponentially. In such cases correlations often decay polynomially, i.e. $$|C_{F,G}(n)| = O(n^{-b})$$ for some $$b>0\ ,$$ whose value then reflects the degree of chaos in the system.

### Applications

The decay of correlations plays a crucial role in nonequilibrium statistical mechanics. It is essential in the studies of relaxation to equilibrium. The autocorrelation function $$C_F(n)$$ is explicitly involved in the formulas for transport coefficients, such as heat conductivity, electrical resistance, viscosity, and the diffusion coefficient.

## Systems with continuous time

The above theory easily extends to dynamical systems with (perhaps, physically more realistic) continuous time. We only indicate its main elements.

Let $$\Phi^t \colon X\to X$$ be a one-parameter family (a flow) of transformations on the phase space $$X$$ that preserve a probability measure $$m\ .$$ Let again $$F$$ denote an observable. Then $$F_t=F\circ\Phi^t$$ is a stationary stochastic process with continuous time $$t\ .$$ Instead of partial sums $$S_n$$ one considers time integrals $S_T = \int_{0}^T F_t\, dt= \int_{0}^T F\circ\Phi^t\, dt.$ The Birkhoff ergodic theorem states that $$S_T/T \to \mu_F$$ as $$T \to \infty$$ for almost every initial state.

The correlation function is defined by $C_{F,G}(t)= m(F_0G_t) -\mu_F\,\mu_G= m(F_sG_{s+t})-\mu_F\,\mu_G \qquad \text{(for any}\ s\text{)}.$ Note that now it is not a sequence but a function of a real argument.

The flow $$\Phi^t$$ is mixing if and only if correlations decay, i.e. $C_{F,G}(t) \to 0 \quad\text{as}\quad t \to \infty$ for every pair of square integrable functions $$F$$ and $$G\ .$$ Suppose the correlations decay fast enough so that the integral $\sigma^2 = \int_{-\infty}^{\infty} C_{F,F}(t)\,dt$ converges absolutely. Now we say that $$F$$ satisfies the Central Limit Theorem (CLT) for flows if $$(S_T-T\mu_F)/\sqrt{T}$$ converges in distribution to normal law $$N(0,\sigma^2)\ .$$

## History

• Ruelle (1968, 1976) and Sinai (1972), see also Bowen (1975), proved that correlations decay exponentially fast and the Central Limit Theorem holds for two (closely related) classes of systems and Hölder continuous observables:
  • Axiom A diffeomorphisms with Gibbs invariant measures;
  • topological Markov chains (also known as subshifts of finite type).
• Hofbauer and Keller (1982) and Rychlik (1983) extended these results to expanding interval maps with smooth invariant measures.
• In the 1990s the same results (exponential decay of correlations and the Central Limit Theorem) were proved for systems with somewhat weaker chaotic behavior (characterized by nonuniform hyperbolicity), such as quadratic interval maps (Young, 1992; Keller and Nowicki, 1992) and the Henon map (Benedicks and Young, 2000).
• In the 1990s these results were extended to chaotic systems with singularities by Liverani (1995) and (specifically to Sinai billiards in a torus) by Young (1998) and Chernov (1999).
• Young (1999) developed a powerful method to study correlations in systems with weak chaos where correlations decay at a polynomial rate.
• Young's method was applied to billiards with slow mixing rates, such as Sinai billiards in a square and Bunimovich billiards. Most notably, the correlations in the stadium were proven to decay as $$O(1/n)\ ;$$ the upper bound was derived by Markarian (2004) and the lower bound by Balint and Gouezel (2006).

## References

• Balint P. and Gouezel S. (2006) Limit theorems in the stadium billiard. Comm. Math. Phys. 263:461-512.
• Benedicks M. and Young L.-S. (2000) Markov extensions and decay of correlations for certain Henon maps. Asterisque 261:13-56.
• Bowen R. (1975) Equilibrium states and the ergodic theory of Anosov diffeomorphisms. Lect. Notes Math. 470, Springer-Verlag, Berlin, 1975.
• Chernov N. (1999) Decay of correlations and dispersing billiards. J. Stat. Phys. 94:513-56.
• Hofbauer F. and Keller G. (1982) Ergodic properties of invariant measures for piecewise monotonic transformations. Math. Z. 180:119-140.
• Keller G. and Nowicki T. (1992) Spectral theory, zeta functions and the distribution of periodic points for Collet-Eckmann maps. Commun. Math. Phys. 149:31-69.
• Liverani C. (1995) Decay of correlations. Annals Math. 142:239-301.
• Markarian R. (2004) Billiards with polynomial decay of correlations. Ergod. Th. Dynam. Syst. 24:177-197.
• Ruelle D. (1968) Statistical mechanics of a one-dimensional lattice gas. Commun. Math. Phys. 9:267-278.
• Ruelle D. (1976) A measure associated with Axiom A attractors. Amer. J. Math. 98:619-654.
• Rychlik M. (1983) Bounded variation and invariant measures. Studia Math. LXXVI:69-80.
• Sinai Ya. G. (1972) Gibbs measures in ergodic theory. Russ. Math. Surveys 27:21-69.
• Young L.-S. (1998) Statistical properties of dynamical systems with some hyperbolicity. Annals Math. 147:585-650.
• Young L.-S. (1999) Recurrence times and rates of mixing. Israel J. Math. 110:153-188.
• Denker M. (1989) The central limit theorem for dynamical systems. Dyn. Syst. Ergod. Th. Banach Center Publ. 23, Warsaw: PWN--Polish Sci. Publ.
• Hard Ball Systems and the Lorentz Gas, Ed. by D. Szasz (2000) Encycl. Math. Sciences, Vol. 101.

Internal references

• Eugene M. Izhikevich (2007) Equilibrium. Scholarpedia, 2(10):2014.
• David H. Terman and Eugene M. Izhikevich (2008) State space. Scholarpedia, 3(3):1924.