Butterfly effect

From Scholarpedia
Catherine Rouvas-Nicolis and Gregoire Nicolis (2009), Scholarpedia, 4(5):1720. doi:10.4249/scholarpedia.1720, revision #137268

Curator: Gregoire Nicolis

Figure 1: Evolution (in steps of 5 years) of the one-day forecast error in meters (dashed line) and of the doubling time of the initial error in days (full line) of the 500 hPa Northern Hemisphere winter geopotential height - a representative measure of the state of the atmosphere - as obtained from the ECMWF operational weather forecasting model.

The Butterfly Effect is a concept invented by the American meteorologist Edward N. Lorenz (1917-2008) to highlight the possibility that small causes may have momentous effects. Initially enunciated in connection with the problem of weather prediction, it eventually became a metaphor used in very diverse contexts, many of them outside the strict realm of science.

A brief history

On December 29, 1972, Lorenz presented a talk at the 139th meeting of the American Association for the Advancement of Science, held in Washington, D.C., entitled

Predictability: Does the Flap of a Butterfly's Wings in Brazil Set a Tornado in Texas?

The principal message of the talk was that the behavior of the atmosphere is unstable with respect to perturbations of small amplitude. In its frailty the butterfly - actually introduced into the title by the convener of the session, who was unable to reach Lorenz at the time the program was released - seemed to provide the ideal illustration of smallness, as opposed to the overwhelming character of phenomena like tornados, which are encountered in our natural environment and interfere decisively with our everyday experience.

From the very start Lorenz was fully aware of the danger of confusion that such a title might create, in view of the disproportion between the butterfly and the tornado. He stressed that in its literal form the question formulated in the title was by no means claimed to have an affirmative answer. The issue was, rather, whether two weather situations differing by as little as the influence of the flap of the wings of a single butterfly will experience, in the long run, two different sequences of events of a certain type (such as tornados): what matters, in the end, is the instability of the atmosphere. Lorenz also warned against identifying the butterfly effect with the butterfly-shaped strange attractor he had discovered in 1963, when studying a three-mode truncation of the Boussinesq equations describing Rayleigh-Bénard flow beyond the thermal convection threshold.

As often happens in the history of ideas, the possibility that small causes may have large effects, in general and in the context of weather in particular, was anticipated by a number of researchers before Lorenz, from Henri Poincaré to Norbert Wiener. The merit of Lorenz's work has been to put the concept of instability of the atmosphere on a solid, quantitative basis and to link it to the properties of large classes of systems undergoing nonlinear dynamics and deterministic chaos.

Error growth and the prediction of complex systems

A physical system such as the atmosphere is inevitably subjected to small uncertainties in the initial conditions that need to be specified when running a model providing information on its future evolution. Such uncertainties are inherent in the process of experimental measurement, which even in its most sophisticated form is limited by a finite precision. An observation under given ambient conditions thus implies that, instead of a single system represented by an isolated point in the state space (the phase space, in the terminology of dynamical systems theory), one deals in reality with an ensemble of systems contained within an uncertainty ball occupying a finite volume in this space. The system of interest lies somewhere inside this ball, but we are unable to specify its exact position, since for the observer all of its points represent one and the same state. In short, physical systems are subjected to a universal source of perturbations related to the presence of initial errors. The question is, then, whether they will respond by keeping errors under control (in which case they will be deemed stable) or, on the contrary, will amplify them in the course of their evolution (in which case they will be deemed unstable).

For the atmosphere, this question can be answered by resorting to realistic models describing the evolution of the relevant atmospheric fields in time. One of the best known and most widely used operational models of this kind is the one developed at the European Centre for Medium-Range Weather Forecasts (ECMWF), designed to produce weather forecasts in the range extending from a few days to a few weeks. This involves the daily preparation of an \(N\)-day forecast of the global atmospheric state (typically, \(N=10\) days), using the present day's state as the initial condition. Since the equations are solved by stepwise integration, intermediate-range (1-, 2-, \(\cdots\) day) forecasts are routinely obtained as well. Capitalizing on the fact that the model produces rather good 1-day forecasts, one expects that the state predicted for a given day, one day in advance, may be regarded as equal to the state subsequently observed on that day, plus a relatively small error. By comparing the 1- and 2-day forecasts for the following day, the 2- and 3-day forecasts for the day after, and so on, one can then determine, upon averaging over several consecutive days, how the mean error evolves in time.
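
As an illustration of this forecast-comparison device, the following minimal Python sketch applies it to synthetic data in which the forecast error is made, by construction, to double every 2 days. All arrays and numbers are assumptions of the demo, not ECMWF output.

```python
import numpy as np

rng = np.random.default_rng(0)
days, horizon = 365, 10

# Synthetic stand-in data (hypothetical): truth[d] is the state observed on
# day d, and forecast[n, d] the n-day-ahead forecast valid for day d, whose
# error is made to double every 2 days for the purpose of the illustration.
truth = rng.standard_normal(days)
lead = np.arange(horizon)[:, None]
forecast = truth + 0.1 * 2.0 ** (lead / 2.0) * rng.standard_normal((horizon, days))

# Lorenz's device: the difference between the n- and (n+1)-day forecasts
# valid for the same day behaves like a small initial error that has grown
# for n days, so its growth rate estimates the error-doubling time.
pair_err = np.sqrt(((forecast[1:] - forecast[:-1]) ** 2).mean(axis=1))
slope = np.polyfit(np.arange(1, 6), np.log(pair_err[1:6]), 1)[0]
print(f"estimated doubling time: {np.log(2) / slope:.2f} days")
```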

The result for the period 1982-2002 (in steps of 5 years), summarized in Figure 1, establishes the presence of error growth as a result of sensitive dependence on the initial conditions in the atmosphere. What is more, this dependence leads here to the strongest possible form of growth, namely, exponential growth. The full line in the figure depicts the doubling time of the initial error. In 1982 this value was about 2 days, as determined by Lorenz himself in a seminal work in which the ECMWF data were first analyzed in this perspective. As can be seen, this value had dropped to about 1.2 days by 2002. This decrease appears at first sight paradoxical, since during this 20-year period one witnessed significant technological advances such as an increase of spatial resolution, a substantial improvement of parameterization schemes and an almost three-fold decrease of the initial (1-day forecast) error (dashed line in Figure 1). It reflects the fact that, although the accuracy of forecasts for a few days ahead has increased considerably, detailed forecasting of weather states at sufficiently long range with models of increasing sophistication may prove impracticable, owing to the complexity inherent in the dynamics of the atmosphere. Part of this complexity is due to the coexistence of processes unfolding on a wide range of space and time scales: errors in the coarser structure of a weather pattern would tend to double much more slowly than errors in the finer structure (e.g., the positions of individual clouds), were it not for the fact that the latter, having attained an appreciable size, tend to induce errors in the coarser structure as well.

Figure 2: Illustration of the phenomenon of sensitivity to the initial conditions in a model system giving rise to deterministic chaos. Black line denotes the trajectory of the reference system. Red and blue lines denote the trajectories emanating from two initial conditions differing from the reference trajectory by \(\epsilon_1=10^{-3}\) (red curve) and \(\epsilon_2=10^{-2}\) (blue curve).

Clearly, as soon as the distance between two instantaneous states separated initially by a very small error exceeds the experimental resolution, the states will cease to be indistinguishable for the observer. As a result, it will be impossible to predict the future evolution of the system at hand beyond this temporal horizon. This raises the fundamental question of the predictability of the phenomena underlying the behavior of the atmosphere.

Now, exponential sensitivity to the initial conditions turns out to be the principal signature of deterministic chaos, a well-known behavior underlying large classes of deterministic dynamical systems governed by nonlinear evolution laws. This opens the way to an analysis of error growth and an understanding of the butterfly effect in the atmosphere using tools from chaos theory. Furthermore, since deterministic chaos is known to occur not only in systems involving large numbers of intricately coupled variables but also in ordinary-looking systems involving only a few variables, error growth and butterfly-type effects appear in fact to be concepts of universal validity. This rules out once and for all the idea that the butterfly effect could merely reflect incomplete knowledge of the atmosphere connected with the presence of huge numbers of variables (up to \(10^7\) or so for the ECMWF forecasting model) and parameters masking some underlying regularities: systems obeying evolution laws known to their least detail, subjected to perfectly well controlled parameters, can still turn out to be unpredictable beyond a certain temporal horizon.

The black line of Figure 2 depicts a time series generated by a prototypical one-variable, discrete-time dynamical system giving rise to chaotic behavior. The red and blue lines of this figure correspond to the succession in time of the variable \(x\) emanating from two initial conditions differing from the reference trajectory (black line) by errors of amplitudes \(\epsilon_1=10^{-3}\) and \(\epsilon_2=10^{-2}\ ,\) respectively. As can be seen, the reference and perturbed trajectories practically follow each other up to a certain time \(t^*_i\) (which decreases with \(\epsilon_i\ ,\) \(i=1,2\)) and subsequently diverge by amounts comparable to the entire range of variation of \(x\ .\)
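
The following sketch reproduces the spirit of Figure 2 with the logistic map \(x_{t+1}=4x_t(1-x_t)\) as the chaotic one-variable system. The article does not specify which map was used, so this choice, like the detection threshold and the initial condition, is an assumption of the demo.

```python
import numpy as np

# Logistic map at full chaos (r = 4), an assumed stand-in for the
# prototypical one-variable chaotic system of Figure 2.
def trajectory(x0, n_steps):
    xs = np.empty(n_steps)
    x = x0
    for t in range(n_steps):
        xs[t] = x
        x = 4.0 * x * (1.0 - x)
    return xs

x0 = 0.3
ref = trajectory(x0, 60)
for eps in (1e-3, 1e-2):
    pert = trajectory(x0 + eps, 60)
    # first time the two states become "distinguishable" at resolution 0.1
    t_star = np.argmax(np.abs(pert - ref) > 0.1)
    print(f"eps = {eps:g}: trajectories separate after ~{t_star} steps")
```

As in the figure, the larger initial error \(\epsilon_2\) leads to an earlier separation time.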

Using the data of the figure one may deduce the instantaneous value of the error

\[ u_t(\epsilon, x_0)=x(t, x_0+\epsilon)-x(t, x_0) \]

For a given \(x_0\ ,\) this quantity displays a very pronounced variability. Averaging over all possible initial states \(x_0\) compatible with the dynamics leads to the logistic-like mean quadratic error growth curve of Figure 3. Three different stages may be distinguished: an initial (short-time) induction stage during which errors increase exponentially but remain (for \(\epsilon\) small enough) small; an intermediate explosive stage displaying an inflexion point situated at a value \(t^*\) of \(t\) depending logarithmically on \(\epsilon\ ,\) \(t^*\approx \ln(1/\epsilon)\ ,\) where errors suddenly attain appreciable values; and a final stage, where the mean error reaches a saturation level of the order of the size of the attractor and remains constant thereafter. The mechanism ensuring this saturation is the reinjection of the trajectories that would at first tend to escape, owing to the instability of the motion, back into a subset of phase space that is part of the attractor.
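
A minimal sketch of the averaging procedure behind this curve, again with the logistic map assumed as the underlying system; the number of initial conditions, time horizon and sampling of the attractor are assumptions of the demo.

```python
import numpy as np

# Mean quadratic error growth: evolve many pairs of trajectories differing
# by eps and average the squared error at each time over initial conditions.
def mean_quadratic_error(eps, n_init=10_000, n_steps=40, seed=1):
    rng = np.random.default_rng(seed)
    x = rng.uniform(0.01, 0.99, n_init)   # crude stand-in for states on the attractor
    y = x + eps
    errors = np.empty(n_steps)
    for t in range(n_steps):
        errors[t] = np.mean((y - x) ** 2)
        x, y = 4 * x * (1 - x), 4 * y * (1 - y)
    return errors

for eps in (1e-6, 1e-4):
    e = mean_quadratic_error(eps)
    t_star = int(np.argmax(np.diff(e)))   # steepest growth ~ inflexion point
    print(f"eps = {eps:g}: explosive stage around t* = {t_star}, "
          f"saturation level = {e[-1]:.3f}")
```

The printed \(t^*\) values shift by roughly \(\ln(\epsilon_2/\epsilon_1)\) while the saturation level is the same, in line with the three stages described above.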

Figure 3: Time dependence of the mean quadratic error of the system of Figure 2, starting from 100,000 initial conditions scattered on the attractor and two different initial quadratic errors, \(\epsilon_1=10^{-6}\) and \(\epsilon_2=10^{-4}\ .\) \(t_1^*\) and \(t_2^*\) stand for the locations of the inflexion points characterizing the "explosive" stage of the growth of errors.

Notice that the initial states (reference as well as perturbed) considered in Figure 2 and Figure 3 lie on one and the same attractor, uniquely defined once the evolution law and the parameter values are specified. The trajectories depicted in Figure 2 thus describe the same type of behavior, differing only in the way events succeed one another in time. This substantiates Lorenz's warning (cf. the preceding section) on how the butterfly effect is to be understood. An additional important feature pertains to the weak (logarithmic) dependence of the error explosion time \(t^*\) on the initial value \(\epsilon\ :\) decreasing \(\epsilon\) from, say, \(10^{-2}\) to a value like \(10^{-12}\ ,\) close to a typical thermal fluctuation relative to the mean, would only increase \(t^*\) from about 5 to about 30. Naturally, on scales as small as those on which thermal noise begins to manifest itself, other effects are likely to take over and mask the butterfly effect.

The dynamical systems connection outlined above also allows one to identify a number of intrinsic quantities, determined entirely by the evolution law and the parameter values, providing quantitative measures of the butterfly effect. Most prominent among them is the (maximum) Lyapunov exponent, defined as

\[ \sigma_{max}=\lim_{t\rightarrow \infty, \epsilon \rightarrow 0} \frac{1}{t} \ln \frac{u_t(\epsilon,x_0)}{\epsilon} \]

in the double limit of (in the indicated order) infinitely small initial errors \(\epsilon\) and infinitely long times \(t\ .\) In this setting error growth and butterfly effect are consequences of the positivity of \(\sigma_{max}\ ,\) and \(\sigma_{max}^{-1}\) (along with \(t^*\) above) defines the time horizon beyond which predictions become essentially random.
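
For a concrete one-variable example, \(\sigma_{max}\) for the (assumed) logistic map can be estimated by averaging the local stretching rate \(\ln|f'(x)|\) along a long trajectory, a standard numerical shortcut that relies on ergodicity to realize the double limit.

```python
import numpy as np

# Estimate sigma_max for f(x) = 4x(1-x) by averaging log|f'(x)| = log|4 - 8x|
# along a long chaotic orbit; for this map the exact value is ln 2.
x, n_steps, acc = 0.3, 100_000, 0.0
for _ in range(n_steps):
    acc += np.log(abs(4.0 - 8.0 * x))
    x = 4.0 * x * (1.0 - x)
sigma_max = acc / n_steps
print(f"sigma_max ~ {sigma_max:.4f} (theory: ln 2 = {np.log(2):.4f})")
print(f"predictability horizon ~ 1/sigma_max ~ {1.0 / sigma_max:.2f} steps")
```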

Typical dynamical systems live in a multi-dimensional phase space and thus possess several Lyapunov exponents, some of which are negative. For short times all these exponents are expected to take part in the error dynamics. Since a typical attractor associated to a chaotic system is fractal, a small error displacing the system from an initial state on the attractor may well project it outside the attractor. Error dynamics may then involve a transient stage prior to the re-establishment of the attractor, during which errors decay in time.

An important class of multivariate systems are spatially extended systems. Here it is often convenient to expand the quantities of interest in terms of the different spatial scales over which the phenomenon of interest can develop and, in particular, the different scales on which an initial error can occur. The ideas outlined above imply, then, that the predictability properties of a phenomenon depend on its spatial scale. In summary, error growth dynamics is subject to strong variability, since not all initial errors grow at the same rate. As a result, the different predictability indexes such as \(\sigma_{max}\ ,\) the saturation level and the time \(t^*\) to reach the inflexion point provide only a partial picture, as in reality the detailed evolution depends upon the way the different possible error locations and directions are weighted.

A situation worth mentioning is that of errors increasing in a subexponential fashion, e.g., as powers of \(t\ .\) In a chaotic dynamical system this happens transiently along the directions associated to its vanishing Lyapunov exponents, prior to the stage where the directions associated to the positive exponents take over. In non-chaotic systems, such as systems undergoing non-uniform periodic or quasi-periodic motion, power-law transient growth is the rule, and is associated with increasingly large phase shifts between the reference and the perturbed system. Such errors do not count when dealing with the butterfly effect: on average their magnitude will remain close to the initial value \(\epsilon\ ,\) although in some particular realizations one may temporarily witness an abrupt growth stage.

Initial errors, model errors and environmental variability

Much like experiment, the modeling of a physical phenomenon has its limitations. First, once a certain level of description is chosen, small-scale processes (such as local turbulence in the context of atmospheric dynamics) are automatically overlooked, since they exceed the adopted (finite) resolution. Furthermore, many of the parameters built into the model may not be known to great precision. In addition to initial errors, prediction must thus cope with model errors, reflecting the fact that a model is only an approximate representation of reality. The key question is, then, to what extent model errors will be amplified in time to a point compromising the quality of the prediction. This raises the problem of a parametric butterfly effect, associated with the sensitivity of a system with respect to changes of the underlying evolution laws and referred to in nonlinear dynamics as structural stability. We emphasize that if the dynamics were simple, initial or model errors would not matter. But this is manifestly not the case in large classes of systems. Initial and model errors can thus be regarded as probes revealing the underlying instability and complexity of the system at hand.

Natural complex systems like the atmosphere can reasonably be expected to be structurally stable, as they are the result of an evolution during which they have adapted to the ambient conditions. From the standpoint of predictability this means that the attractors of the reference, real system and of the approximate model will have similar structures and lie relatively close to each other in phase space, differing only in their quantitative properties. The relevant question here is, then, how the perturbed system (the model) will deviate as time proceeds from its initial state on the unperturbed attractor (the real system) prior to reaching the final attractor. Theoretical developments along with simulations on model systems lead to the following conclusions:

  • In the short-time regime, mean quadratic model errors start at zero level and increase as \(t^2\ ,\) with a proportionality coefficient depending on the magnitude of the error in the parameters or of the terms omitted at the adopted resolution. Contrary to initial errors (which start with a nonzero value at \(t=0\)), the instability of motion, and the largest Lyapunov exponent in particular, does not play a crucial role here.
  • For longer times model errors grow much like in the curve of Figure 3 (a minimal numerical illustration of both regimes is sketched below). Interestingly, the saturation level attained is again finite and practically independent of the quality of the model, as it reflects, to a first approximation, the average of the typical quadratic distances between any two points of the reference attractor, which tend to become increasingly phase shifted as time goes on.
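
A minimal sketch of these two regimes, with the logistic map as the (assumed) reference system and a slightly mis-specified parameter playing the role of the model error; all amplitudes are assumptions of the demo.

```python
import numpy as np

n_steps, n_init = 30, 10_000
rng = np.random.default_rng(2)
x = rng.uniform(0.01, 0.99, n_init)       # reference ensemble ("reality")
x_ini = x + 1e-4                          # initial-condition error only
x_mod = x.copy()                          # model error only: same start, wrong law
e_ini, e_mod = [], []
for t in range(n_steps):
    e_ini.append(np.mean((x_ini - x) ** 2))
    e_mod.append(np.mean((x_mod - x) ** 2))
    x = 4.0 * x * (1.0 - x)
    x_ini = 4.0 * x_ini * (1.0 - x_ini)
    x_mod = (4.0 - 1e-6) * x_mod * (1.0 - x_mod)   # slightly wrong parameter
print("model error starts at exactly zero:", e_mod[0])
t_bar = next(t for t in range(n_steps) if e_mod[t] > 0.5 * e_ini[t])
print(f"the two errors become comparable around t = {t_bar}")
```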

In summary, the initial stage of the dynamics of global (initial plus model) errors is bound to be dominated by the growth of initial errors, since model errors are initially zero. For long times both initial and model errors attain a finite level, depending on the nature of the attractor of the reference system. As a rule, between these two extremes one witnesses a crossover between the growth of the two types of error, as illustrated in Figure 4. Beyond the crossover time \(\overline{t}\ ,\) then, the classical butterfly effect is superseded by an effect reflecting the sensitivity of the evolution laws themselves to small errors. As models play an essential role in most forecasting schemes, this constitutes an additional irreducible limitation in the prediction of complex systems.

Figure 4: Typical time dependence of the mean quadratic error of a model dynamical system giving rise to deterministic chaos, starting from 100,000 initial conditions scattered on the attractor, in the presence of initial condition errors (red line), model errors (blue line) and both initial condition and model errors (green line). \(\overline{t}\) denotes the crossover time at which both sources of error attain a comparable magnitude.

In the preceding discussion it was understood that the values of characteristic parameters (real or model ones) remained fixed. There is growing interest in the response of a complex system in general and of the atmosphere in particular to external forcings and varying parameters - for instance, as a result of anthropogenic effects. The main results on the status of the butterfly effect under these conditions can be summarized as follows:

  • External forcings of even weak amplitude may induce qualitatively new effects in the form of enhanced sensitivity (stochastic resonance, etc.) or of transitions between states that would otherwise remain separated. This complicates further the task of prediction.
  • A systematic slow variation of a parameter in time can enhance the stability of a state that would otherwise tend to undergo an instability and can even lead to situations where the system becomes for all practical purposes frozen in a state that could otherwise not be sustained. At the same time, however, the fluctuations around the mean tend to increase and as a result the occurrence of extreme events is enhanced.
  • For very short times the effect of stochastic forcings (however small, including thermal noise) dominates over that of the (deterministic) evolution laws. As a result, the stage of exponential growth characteristic of deterministic chaos sets in only beyond some characteristic time depending on the noise strength, as initially quadratic errors grow only linearly in time. During this time regime the butterfly effect is, then, attenuated (a minimal sketch of this interplay follows the list).
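
As a rough illustration of the last point, the sketch below adds a weak stochastic forcing to the (assumed) logistic-map ensemble: the error between two otherwise identical realizations is created by the noise itself and only subsequently amplified by the deterministic instability. In a map with strong stretching the exponential stage takes over after just a few steps; in weakly unstable continuous-time systems the noise-dominated stage lasts correspondingly longer.

```python
import numpy as np

rng = np.random.default_rng(5)
n_init, n_steps, s = 10_000, 25, 1e-7
x = rng.uniform(0.01, 0.99, n_init)
y = x.copy()                          # identical initial conditions: no initial error
for t in range(n_steps):
    # each realization receives its own weak noise kick at every step
    x = np.clip(4 * x * (1 - x) + s * rng.standard_normal(n_init), 0.0, 1.0)
    y = np.clip(4 * y * (1 - y) + s * rng.standard_normal(n_init), 0.0, 1.0)
    if t < 4 or t == n_steps - 1:
        print(f"t = {t:2d}: mean quadratic error = {np.mean((x - y) ** 2):.3e}")
```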

Taming the butterfly: the probabilistic approach to prediction

The sensitivity and intrinsic randomness of complex systems symbolized by the butterfly effect signal the limitations of the traditional deterministic description, in which one focusses on the detailed, pointwise evolution of individual trajectories. Now, as seen earlier, owing to the finite precision of the process of measurement in nature, an instantaneous state is in reality to be understood as a small region in phase space. In the presence of the butterfly effect this region will subsequently be deformed and the individual points within it will become increasingly delocalized. To the observer this signals the inability to predict the future beyond a certain transient period on the basis of knowledge of the present conditions. These elements constitute a compelling motivation for seeking an alternative description capable of coping in a natural fashion with irregular successions of events, delocalization in state space and built-in uncertainties. The probabilistic approach offers this natural alternative.

A fundamental point is that the evolution of systems composed of several subunits and undergoing complex dynamics can be mapped into a probabilistic description in a self-consistent manner, free of heuristic approximations. The probabilistic and deterministic views become thus two facets of the same reality, and this allows one to sort out regularities of a new kind.

One of the novelties brought by the probabilistic description is that the evolution of the underlying probability distributions (described by Liouville, master or Fokker-Planck equations) - which now become the principal quantities of interest - is linear and displays strong stability and uniqueness properties. This is in sharp contrast with the deterministic description, in which nonlinearity and instability are prominent. As we now show, this provides the basis of a new approach to prediction. When implemented on a mathematical model representing a concrete system such as the atmospheric circulation, the probabilistic approach amounts to choosing a set of initial conditions compatible with the available data; integrating the model equations for each of these initial conditions; and evaluating the averages (or higher moments) of the quantities of interest over these individual realizations. In the context of atmospheric dynamics this procedure is known as ensemble forecasting. Its principal merit is to temper the strong fluctuations associated with a single realization and to sort out quantitative trends in relation to the indicators of the intrinsic dynamics of the system at hand.

Figure 5 illustrates schematically the nature of ensemble forecasts. The full circle in the initial phase space region \(\delta \Gamma_0\) stands for the best initial value available. Its evolution in phase space, first after a short lapse of time (region \(\delta \Gamma_1\)) and next at the time of the final forecast projection (region \(\delta \Gamma_2\)), is represented by the red line. Now, the initial position is only one of several plausible initial states of the atmosphere, in view of the errors inherent in the analysis. There exist other plausible states clustered around it, represented in the figure by open circles. As can be seen, the trajectories emanating from these ensemble members (blue lines) differ only slightly at first. But between the intermediate and the final time they diverge markedly, presumably because the predictability time has been exceeded: there is a subset of the initial ensemble, including the best guess, that produces similar forecasts, but the remaining members predict a rather different atmospheric state. This dispersion is indicative of the uncertainty of the forecast. It constitutes an important source of information that would not be available if only the best initial condition had been integrated, especially when extreme situations are suspected to take place in the very near future. It should be pointed out that such uncertainties frequently reflect local properties of the dynamics, such as local expansion rates and the orientation of the associated phase space directions.
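
The following sketch implements the procedure in its simplest form on the Lorenz 1963 system, chosen here purely as an illustrative stand-in for an atmospheric model; the ensemble size, perturbation amplitude and crude Euler integration are all assumptions of the demo, far removed from operational practice.

```python
import numpy as np

# One explicit Euler step of the Lorenz 1963 equations, vectorized over
# an ensemble of states of shape (n_members, 3).
def step(s, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s[..., 0], s[..., 1], s[..., 2]
    d = np.stack([sigma * (y - x), x * (rho - z) - y, x * y - beta * z], axis=-1)
    return s + dt * d

rng = np.random.default_rng(3)
best_guess = np.array([1.0, 1.0, 20.0])
# plausible initial states clustered around the best analysis
members = best_guess + 1e-3 * rng.standard_normal((50, 3))
for t in range(1, 2001):
    members = step(members)
    if t in (100, 500, 2000):              # three forecast ranges
        spread = members.std(axis=0)       # dispersion = forecast uncertainty
        print(f"t = {t * 0.01:5.1f}: ensemble spread = {np.round(spread, 3)}")
```

The growth of the printed spread from negligible to attractor-sized values mirrors the splitting of the blue trajectories in Figure 5.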

Figure 5: Illustrating the nature of ensemble forecasts. The phase space regions \(\delta \Gamma_0\ ,\) \(\delta \Gamma_1\) and \(\delta \Gamma_2\) represent three successive snapshots of an ensemble of nearby initial conditions (left) as the forecasting time increases. The red line represents the traditional deterministic single trajectory forecast, using the best initial state as obtained by advanced statistical and data analysis techniques. The blue lines represent the trajectories of other ensemble members, which remain close to each other for intermediate times (middle) but subsequently split into two subensembles (right), suggesting that the deterministic forecast becomes unrepresentative. Notice the deformation of the phase space volumes accompanying the underlying instability.

The probabilistic approach can also be applied to developing predictive models on the sole basis of data. An example of how this is achieved pertains to the transition between atmospheric regimes, such as the onset of drought. The main idea is that, to be compatible with such data, the underlying system should possess (as far as its hydrological properties are concerned) two coexisting attractors corresponding, respectively, to a regime of quasi-normal precipitation and a regime of drought. In a deterministic setting the system would choose one or the other of these attractors depending on the initial conditions and would subsequently remain trapped therein. In reality, under the influence of the fluctuations generated spontaneously by the local transport and radiative mechanisms, or of perturbations of external origin such as, for instance, surface temperature anomalies, the system can switch between attractors and change its climatic regime. This is at the origin of an intermittent evolution in the form of small-scale variability around a well-defined state followed by a jump toward a new state, which reproduces the essential features of the record. The idea can of course be applied to a host of other problems, including the transition between the well-known zonal and blocked atmospheric flows. In their quantitative form the models belonging to this family appear as evolution equations for the underlying probability distributions, from which a number of relevant quantities, such as the lifetime of a given atmospheric regime, can be evaluated.
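
A minimal sketch of such fluctuation-induced transitions, using an assumed overdamped double-well model \(dx=(x-x^3)\,dt+s\,dW\) whose two wells stand in for the quasi-normal and drought regimes; the noise strength and regime thresholds are assumptions of the demo.

```python
import numpy as np

rng = np.random.default_rng(4)
dt, n_steps, s = 0.01, 200_000, 0.4
x, regime, switches = 1.0, 1, 0            # start in the "quasi-normal" well x = +1
for _ in range(n_steps):
    # Euler-Maruyama step for dx = (x - x^3) dt + s dW
    x += (x - x ** 3) * dt + s * np.sqrt(dt) * rng.standard_normal()
    if regime == 1 and x < -0.5:           # jump into the "drought" well
        regime, switches = -1, switches + 1
    elif regime == -1 and x > 0.5:         # jump back
        regime, switches = 1, switches + 1
print(f"regime switches observed: {switches}")
print(f"mean residence time ~ {n_steps * dt / max(switches, 1):.0f} time units")
```

The mean residence time printed here plays the role of the lifetime of an atmospheric regime evaluated from the probabilistic models mentioned above.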

Butterfly effect, causality and chance

The ubiquity of the butterfly effect in large classes of complex systems prompts one to reflect on the connection between two concepts that have been regarded as quite distinct throughout the history of science and of ideas in general, namely, causality and chance.

Classical causality relates two qualitatively different kinds of events, the causes and the effects, on which it imposes a universal ordering in time. From the early Greek philosophers to the founders of modern science, causality has been regarded as a cornerstone, guaranteeing that nature is governed by objective laws and imposing severe constraints on the formulation of theories aiming to explain natural phenomena. Technically, causes may be associated to the initial conditions on the variables describing the system, or to the constraints (more generally, the parameters) imposed on it. In a deterministic setting this fixes a particular trajectory (more generally, a particular behavior), and it is this unique cause-to-effect relationship that constitutes the expression of causality and is ordinarily interpreted as a dynamical law.

But suppose that one is dealing with a complex system displaying sensitivity to the initial conditions, as occurs in deterministic chaos, or sensitivity to the parameters and to modeling errors in general. Minute changes in the causes now produce effects that look completely different from a deterministic standpoint, thereby raising the question of the predictability of the system at hand. Clearly, under these circumstances the causes acquire a new status. Without putting causality in question, one is led to recognize that its usefulness for making predictions needs to be reconsidered. It is here that statistical laws offer a natural alternative. While they are formally related to the concept of chance, the point stressed in the previous section is that they need not require extra statistical hypotheses: when appropriate conditions on the dynamics are fulfilled, statistical laws are emergent properties that not only constitute an exact mapping of the underlying (deterministic) dynamics but also reveal key features of it that would be blurred in a traditional description in terms of trajectories. In a sense one is dealing here with a deterministic randomness of some sort. In fact, the equations governing the probability distributions associated to a complex system are deterministic and causal as far as their mathematical structure is concerned: they connect an initial probability (the cause) to a time-dependent or an asymptotic one (the effect) in a unique manner. By its inherent linearity and stability properties the probabilistic description re-establishes causality and allows one still to make predictions, albeit in a perspective that is radically different from the traditional one.

From facts to fiction

The concept of the butterfly effect refers to a real-world phenomenon of universal bearing, well beyond the framework of atmospheric physics in which it was initially proposed. It highlights the fact that science is not in a position to predict everything once sufficient information is gathered, owing to the existence of intrinsic limitations. In this respect, it has contributed to the advent of a new, post-Newtonian scientific paradigm nowadays referred to as the complexity paradigm.

Little things in the past can also make big differences in everyday life when ideas and trends cross a threshold, tip and spread, as discussed in an interesting book by Malcolm Gladwell. The author, correctly, makes no attempt whatsoever to relate this type of phenomenon to the butterfly effect in the sense of Lorenz. In a similar vein, biological evolution and Darwinian adaptation can be viewed as the accumulation of small changes over a long period, which at some stage produce momentous effects that cannot be predicted in advance. Again, this is not to be confused with the butterfly effect. Rather, as pointed out by some authors, the difficulty here is that we cannot know in advance the state space of the evolving biosphere.

The sensitivity of large classes of systems and the concomitant difficulty of issuing long-term predictions is a well-established fact in very diverse fields beyond the strict realm of physical science, such as sociology and finance. The butterfly effect constitutes here a powerful analogy that can be used fruitfully to raise questions and to transpose techniques that would otherwise be impossible to imagine. The picture becomes blurred, unfortunately, from the moment one switches from facts to metaphors invoked in an uncontrolled way. This is what has happened repeatedly in recent decades, when the butterfly effect was transposed into mass culture to explain how a chain of events of apparently no importance can change History and forge destinies. Most if not all of these transpositions are, simply, dubious science and, what is more, are highly misleading for the public to which they are addressed. Indeed, the essence of the butterfly effect is that, on the contrary, following small changes in the past one would never be in a position to fully evaluate the consequences for the present, in view of the highly complex and intricately correlated sequences of events separating the reference and the modified paths.

Summing up

Classical science has emphasized stability and permanence. Developments spanning the last decades show, on the contrary, that instability, sensitivity and unpredictability underlie large classes (if not most) of phenomena occurring on macroscopic time and space scales - the scales of our everyday experience. There is a need for decision makers, for the public and even for a part of the scientific community to adapt to this state of affairs and to the modes of reasoning it requires. Many systems of concern, from the atmosphere to the stock market, need to be observed, monitored, modeled and predicted in a way that does justice to their intrinsic complexity; otherwise essential features are likely to be missed. The butterfly effect stands as a symbol of this new rationality.

References

History of the butterfly effect concept

  • P. Duhem, La Théorie physique: son objet, sa structure, Marcel Rivière, Paris (1906).
  • J. Hadamard, Les surfaces à courbures opposées et leurs lignes géodésiques, J. Math. Pures et Appl. 4, 27-73 (1898).
  • R. Hilborn, Sea gulls, butterflies and grasshoppers: a brief history of the butterfly effect in nonlinear dynamics, Amer. J. Phys. 72, 425-427 (2004).
  • E. N. Lorenz, Deterministic non-periodic flow, J. Atmos. Sci. 20, 130-141 (1963).
  • E. N. Lorenz, The essence of chaos, University of Washington Press (1993).
  • H. Poincaré, Science et méthode, Flammarion, Paris (1908).
  • N. Wiener, Nonlinear prediction and dynamics, in Proc. 3rd Berkeley Symp. on Math. Statistics and Probability, Vol. 3, University of Berkeley Press (1954).

Error growth and predictability

  • E. N. Lorenz, Atmospheric predictability as revealed by naturally occurring analogues, J. Atmos. Sci. 26, 636-646 (1969).
  • E. N. Lorenz, Atmospheric predictability experiments with a large numerical model, Tellus, 34, 505-513 (1982).
  • C. Nicolis, Probabilistic aspects of error growth in atmospheric dynamics, Q.J.R. Meteorol. Soc. 118, 553-568 (1992).
  • G. Nicolis and C. Nicolis, Foundations of complex systems, World Scientific, Singapore (2007).

Model error and time dependent forcings

  • C. Nicolis, Transient climatic response to increasing \(CO_2\) concentration: some dynamical scenarios, Tellus, 40A, 50-60 (1988).
  • C. Nicolis, Dynamics of model errors: some generic features, J. Atmos. Sci., 60, 2208-2218 (2003).
  • C. Nicolis, Dynamics of model errors: the role of unresolved scales, J. Atmos. Sci., 61, 1749-1753 (2004).
  • C. Nicolis and G. Nicolis, Passage through a barrier with a slowly increasing control parameter, Phys. Rev. E 62, 197-203 (2000).
  • G. Nicolis, Introduction to nonlinear science, Cambridge University Press, Cambridge (1995).

Probabilistic predictions

  • G. Demarée and C. Nicolis, Onset of Sahelian drought viewed as a fluctuation-induced transition, Q.J.R. Meteorol. Soc. 116, 221-238 (1990).
  • E. Kalnay, Atmospheric modeling, data assimilation and predictability, Cambridge University Press, Cambridge (2003).
  • G. Nicolis and P. Gaspard, Towards a probabilistic approach to complex systems, Chaos, Solitons and Fractals, 4, 41-57 (1994).
  • Ya. Sinai, Introduction to ergodic theory, Princeton University Press, Princeton (1977).

Beyond physical science

  • The Boston Globe, taken up in Le Monde, Courrier International no. 936, October 9 to 15 (2008).
  • M. Gladwell, The tipping point, Abacus, London (2001).
  • S. Kauffman, Reinventing the sacred, Basic Books, New York (2008).

Internal references

  • Jan A. Sanders (2006) Averaging. Scholarpedia, 1(11):1760.
  • Olaf Sporns (2007) Complexity. Scholarpedia, 2(10):1623.
  • Gregoire Nicolis and Catherine Rouvas-Nicolis (2007) Complex systems. Scholarpedia, 2(11):1473.
  • Giovanni Gallavotti (2008) Fluctuations. Scholarpedia, 3(6):5893.
  • Philip Holmes and Eric T. Shea-Brown (2006) Stability. Scholarpedia, 1(10):1838.
  • David H. Terman and Eugene M. Izhikevich (2008) State space. Scholarpedia, 3(3):1924.
  • Catherine Rouvas-Nicolis and Gregoire Nicolis (2007) Stochastic resonance. Scholarpedia, 2(11):1474.
  • James Murdock (2006) Unfoldings. Scholarpedia, 1(12):1904.


See also

Chaos
