# Kolmogorov-Sinai entropy

Curator: Yakov Sinai

## Definition

We start by giving the definition of the entropy of a dynamical system. Consider dynamical systems with discrete time. The phase space of the dynamical system is denoted by $$M\ .$$ It is equipped with a $$\sigma$$-algebra $$\mathcal M$$ and a probability measure $$\mu$$ defined on $$\mathcal M\ .$$ In general ergodic theory, the dynamics is given by a measurable transformation $$T$$ of $$M$$ onto itself preserving the measure $$\mu\ .$$ For many applications it is enough to assume that $$M$$ is a Lebesgue space, i.e., that it is isomorphic (as a measure space) to the unit interval with the usual Lebesgue measure. The concept of a Lebesgue space was introduced in the works of Halmos, von Neumann and Rokhlin and is now widely used.

Take a finite partition $$\xi = \{ C_1 , C_2 , \ldots , C_r \}$$ of $$M\ .$$ It generates a stationary random process with values $$1 , 2 , \ldots , r$$ via the formula

$w_k ( x ) \, = \, j \ \mbox{if} \ x \, \in \, T^{-k} \, C_j \, , - \infty \, < \, k \, < \, \infty \, .$

Thus, each $$x$$ generates a realization of the random process $$w ( x ) = \{ \ldots , w_{-n} ( x ) , \ldots , \, w_0 ( x ) ,$$ $$w_1 ( x ) \, , \ldots \,, w_m ( x ) , \ldots \}\ .$$ The so-called Shannon-McMillan theorem tells us that the following limit exists: $h ( T , \xi ) \, = \, - \lim\limits_{n \, \longrightarrow \, \infty} \, \frac{1}{n} \, \sum\limits_{i_1 , \ldots , i_n} \, \mu ( T^{-1} C_{i_1} \, \cap \ldots \cap \, T^{-n} C_{i_n} ) \, \ln \, \mu ( T^{-1} C_{i_1} \cap \ldots \cap \, T^{-n} C_{i_n} ) \, .$
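For a Bernoulli shift (an i.i.d. process) this limit can be checked by direct computation: the measure of each $$n$$-cylinder is a product of the one-symbol probabilities, so the normalized block entropy is the same for every $$n$$ and equals the Shannon entropy of the marginal distribution. A minimal numerical sketch in Python (the function name is illustrative):

```python
import math
from itertools import product

def block_entropy_rate(p, n):
    """(1/n) * H_n for a Bernoulli measure with marginal p = (p_1, ..., p_r).

    The n-cylinder C_{i_1 ... i_n} has measure p_{i_1} * ... * p_{i_n},
    so we sum -mu * ln(mu) over all r^n cylinders and divide by n."""
    H = 0.0
    for word in product(range(len(p)), repeat=n):
        mu = math.prod(p[i] for i in word)
        H -= mu * math.log(mu)
    return H / n

p = (0.25, 0.75)
shannon = -sum(q * math.log(q) for q in p)       # -sum_i p_i ln p_i
print(block_entropy_rate(p, 5), shannon)         # equal for a Bernoulli shift
```

For a general (non-Bernoulli) process the quantity `block_entropy_rate(p, n)` would only converge to $$h ( T , \xi )$$ as $$n \to \infty$$ rather than coincide with it for every $$n$$.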

Definition 1.  The entropy of the dynamical system is $h ( T ) = \sup\limits_{\xi} \, h ( T , \xi )\ ,$ where the supremum is taken over all finite partitions $$\xi\ .$$

It is clear from the definition that this entropy is a metric invariant of the dynamical system. The following theorem is the main tool which allows one to compute $$h ( T )\ .$$ It uses the notion of a generating partition.

Definition 2.  A partition $$\xi$$ is called a generating partition (or generator) of the dynamical system $$( M , {\mathcal M} , \mu , T)$$ if the smallest $$\sigma$$-algebra containing all $$T^n C_j , - \infty < n < \infty ,$$   $$1 \, \leq \, j \, \leq \, r\ ,$$ is $${\mathcal M}\ .$$

Theorem 1.  If $$\xi$$ is a generating partition then $$h ( T ) = h ( T , \xi)\ .$$

This theorem was proven for Bernoulli partitions by Kolmogorov in his lectures. The proof was based upon the formula for the entropy of Bernoulli partitions. The proof in the general case was given in [S1]. It used an inequality for conditional entropies which becomes an equality in the Bernoulli case. However, at the time when the notion of entropy appeared, the transition from equality to inequality was a serious step.
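The formula for the entropy of Bernoulli partitions referred to here is the familiar one: for the Bernoulli shift with state probabilities $$p_1 , \ldots , p_r\ ,$$

```latex
% Entropy of a Bernoulli shift with state probabilities p_1, ..., p_r:
h(T) \;=\; -\sum_{i=1}^{r} p_i \ln p_i .
% In particular, for the uniform 2-shift h = \ln 2, and for the
% uniform 3-shift h = \ln 3, so the two shifts are not metrically isomorphic.
```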

Entropy can also be defined for dynamical systems with continuous time, since $h ( T^t) = | t | \, h ( T^1)\ ,$ $$- \infty < t < \infty\ .$$ For special flows the so-called Abramov formula connects the entropy of the flow with the entropy of the base automorphism.
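In the form it is usually stated, Abramov's formula reads as follows (for a special flow $$\{ S^t \}$$ built over a base automorphism $$T$$ under a roof function $$f > 0$$):

```latex
% Abramov's formula for a special flow \{S^t\} over the automorphism T
% with roof function f > 0:
h(S^1) \;=\; \frac{h(T)}{\int_M f \, d\mu} ,
% so the entropy of the flow is the entropy of the base automorphism
% divided by the mean return time.
```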

The exposition of the entropy theory of dynamical systems can be found in many monographs and textbooks; see, e.g., [B], [CFS], [P], [W]. Many examples of dynamical systems with positive entropy are now known, even within the class of deterministic dynamical systems. A general theorem of Kushnirenko states that the entropy of the flow generated by a smooth vector field on a compact smooth manifold is finite.

## History

The notion of Metric Entropy of a dynamical system, also known as Measure-Theoretic Entropy, Kolmogorov Entropy, Kolmogorov-Sinai Entropy, or just KS entropy, appeared in the paper by Kolmogorov ([K1]). This was a time when Kolmogorov was interested in and worked on several problems from information theory, the dimension of functional spaces, and so on. The main theorem of KAM theory had appeared in the works of Kolmogorov a little earlier. In 1957 Kolmogorov led a seminar on dynamical systems which was attended by such people as Alexeev, Arnold, Tikhomirov, Pinsker, Meshalkin, the author of the present paper and others.

The problem of metric isomorphism of dynamical systems was certainly one of the most often discussed. In his lectures accompanying the seminar, Kolmogorov presented a probabilistic proof of the von Neumann theorem on the isomorphism of dynamical systems with pure point spectrum. The main point of view at that time was that dynamical systems arising in probability theory differ, even from the metric point of view, from dynamical systems generated by ODEs and PDEs. Motivated by all these ideas, Kolmogorov proposed the notion of entropy, which, it was believed, would allow one to distinguish "probabilistic" dynamical systems from "deterministic" ones. The first announcement of entropy was made by Kolmogorov in one of his lectures. It contained the metric invariant for Bernoulli shifts and gave the proof that 2-shifts and 3-shifts are metrically non-isomorphic. However, in the text prepared for publication the exposition was quite different. Kolmogorov introduced a new class of dynamical systems which he called quasi-regular and defined the notion of entropy only for quasi-regular systems.

Quasi-regular dynamical systems resembled the regular stationary processes of probability theory which had been studied earlier in the theory of random processes. Later, for some time, quasi-regular dynamical systems were called Kolmogorov systems. However, at some point Kolmogorov asked to change this terminology because he hoped to work in this area and it would be inconvenient for him to use words like "Kolmogorov system." The change was made, and the term K-system is now commonly accepted. However, Kolmogorov did not return to this field. The notion of a K-system plays an important role in ergodic theory.

Having written the text and submitted it for publication, Kolmogorov left Moscow for Paris, where he spent the whole semester.

At that time, the author of this paper was already a graduate student and Kolmogorov was his advisor. It was natural to start thinking about a notion of entropy which could be applied to all dynamical systems. After several attempts, the definition given above was invented. However, there were no non-trivial examples of dynamical systems to which this definition could be applied. During this time V.A. Rokhlin became aware of Kolmogorov's paper and his general definition of entropy. Rokhlin became very excited and proposed to compute the entropy of an automorphism of the two-dimensional torus. Following the main point of view at that time, the author tried to prove that the entropy is zero, because the automorphism of the torus is certainly a purely "deterministic" dynamical system. The proof did not go through. Kolmogorov was the first to suggest that the entropy must be positive. After that, it became possible to prove the needed result, and there were enough reasons to publish the general definition of entropy together with the formula for the entropy of the torus automorphism. This was done in [S1], and the title of the paper was "On the notion of entropy of a dynamical system." Some time later, Rokhlin found an example which showed that the entropy introduced by Kolmogorov is not a metric invariant of a dynamical system. This example was reproduced in the paper by Kolmogorov (see [K2]). Now there are many examples of this kind (see the paper [LPS] by E. Lindenstrauss, Y. Peres and W. Schlag). However, the result proven by Kolmogorov in his lecture was correct!
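For a hyperbolic automorphism of the two-dimensional torus induced by an integer matrix with determinant $$\pm 1\ ,$$ the entropy with respect to Lebesgue measure equals $$\ln \lambda\ ,$$ where $$\lambda$$ is the eigenvalue of modulus greater than one. A quick numerical check (the particular matrix below is chosen for illustration):

```python
import math

# Hyperbolic toral automorphism given by the integer matrix
#   A = [[2, 1],
#        [1, 1]],   det A = 1.
a, b, c, d = 2, 1, 1, 1
trace, det = a + d, a * d - b * c

# Eigenvalues of a 2x2 matrix are the roots of t^2 - trace*t + det = 0.
disc = math.sqrt(trace * trace - 4 * det)
lam = (trace + disc) / 2          # the expanding eigenvalue, lam > 1

# Entropy of the automorphism w.r.t. Lebesgue measure: ln(lambda).
entropy = math.log(lam)
print(entropy)                    # ln((3 + sqrt(5)) / 2), about 0.9624
```

Since $$\lambda > 1$$ whenever the matrix is hyperbolic, the entropy is strictly positive, in agreement with Kolmogorov's suggestion.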

The financial support from NSF, Grant #DMS 0600996 is highly appreciated.

## References

• [B] P. Billingsley, Ergodic Theory and Information, (1965), J. Wiley, 106p.
• [CFS] I.P. Cornfeld, S.V. Fomin and Ya.G. Sinai, Ergodic Theory, (1981), Springer-Verlag.
• [K1] A.N. Kolmogorov, New Metric Invariant of Transitive Dynamical Systems and Endomorphisms of Lebesgue Spaces, Doklady of Russian Academy of Sciences, (1958), 119, N5, 861-864.
• [K2] A.N. Kolmogorov, Entropy per Unit Time as a Metric Invariant of Automorphisms, Doklady of Russian Academy of Sciences, (1959), 124, 754-755.
• [LPS] E. Lindenstrauss, Y. Peres and W. Schlag, Bernoulli convolutions and intermediate values for entropy of $$K$$-partitions, (2002), J. Anal. Math., 87, 337-367.
• [P] W. Parry, Entropy and Generators in Ergodic Theory, (1969), W.A. Benjamin, Inc., New York, Amsterdam, 124p.
• [S1] Ya.G. Sinai, On the Notion of Entropy of a Dynamical System, Doklady of Russian Academy of Sciences, (1959), 124, 768-771.
• [W] P. Walters, An Introduction to Ergodic Theory, (1969), Springer-Verlag, New York-Berlin, 250p.

Internal references

• Tomasz Downarowicz (2007) Entropy. Scholarpedia, 2(11):3901.
• David H. Terman and Eugene M. Izhikevich (2008) State space. Scholarpedia, 3(3):1924.
• Brian Marcus and Susan Williams (2008) Symbolic dynamics. Scholarpedia, 3(11):2923.