1/f noise

From Scholarpedia
Lawrence M. Ward and Priscilla E. Greenwood (2007), Scholarpedia, 2(12):1537. doi:10.4249/scholarpedia.1537, revision #90924.

\(1/f\) noise refers to the phenomenon in which the spectral density \(S(f)\) of a stochastic process has the form

\[S(f)=constant/f^{\alpha}\ ,\]

where \(f\) is frequency, on an interval bounded away from both zero and infinity.

\(1/f\) fluctuations are widely found in nature. In the more than 80 years since the first observation by Johnson (1925), long-memory processes with long-term correlations and \(1/f^{\alpha}\) (with \(0.5\lesssim \alpha \lesssim 1.5\)) behavior of power spectra at low frequencies \(f\) have been observed in physics, technology, biology, astrophysics, geophysics, economics, psychology, language and even music (see reviews by Press 1978, Hooge et al. 1981, Dutta and Horn 1981, Kogan 1985, Weissman 1988, West and Shlesinger 1990, Van Vliet 1991, Zhigalskii 1997, Milotti 2002, and Wong 2003).


Introduction

\(1/f\) noise is intermediate between the well understood white noise, with no correlation in time, and random walk (Brownian motion) noise, with no correlation between increments. Brownian motion is the integral of white noise, and integration of a signal increases the exponent \(\alpha\) by 2, whereas the inverse operation of differentiation decreases it by 2. Therefore, \(1/f\) noise cannot be obtained by simple integration or differentiation of such convenient signals. Moreover, there are no simple, even linear, stochastic differential equations generating signals with \(1/f\) noise. The widespread occurrence of signals exhibiting such behavior suggests that a generic mathematical explanation might exist. Except for some formal mathematical descriptions like fractional Brownian motion (half-integral of a white noise signal), however, no generally recognized physical explanation of \(1/f\) noise has been proposed. Consequently, the ubiquity of \(1/f\) noise is one of the oldest puzzles of contemporary physics and science in general.

The case of \(\alpha=1\ ,\) or pink noise, is both the canonical case, and the one of most interest, but the more general form, where \(0 < \alpha \leq 3\ ,\) is sometimes referred to simply as \(1/f\ .\) \(1/f^{\alpha}\) noise is of interest because it occurs in many different systems, both in the natural world and in man-made processes.

The expression "long-range dependence", sometimes used to refer to \(1/f\) noise, has also been used in various other contexts with somewhat different meanings. "Long memory" and other variants are also sometimes used in the same way. For example, Wagenmakers, Farrell and Ratcliff (2004) used the expressions long-range dependence and \(1/f\) noise synonymously. On the other hand, in the paper of Granger and Ding (1988) certain models of long memory are studied in the vicinity of \(f = 0\ .\) But data sets are finite, and arbitrarily small \(f\) cannot be realized. For this reason, we confine our discussion to behavior outside a neighborhood of \(f = 0\ .\)

Often the discovery of \(1/f\) noise in a system has been taken to imply the existence of some other special structure such as self-organized criticality, or multiplicative noise. This inference is not warranted, as our discussion of various models and their contexts will make clear.

Figure 1 displays some time series and, in the same colors, their associated power spectra. Such time series arise in many natural systems. Figure 2 displays a few representative examples from physics, biology, neuroscience, and psychology, to which we will refer in what follows. In both figures, power spectra are plotted in log-log coordinates, as is customary, because \(\log(S(f))=\log(constant/f^{\alpha})=-\alpha \log(f)+\log(constant)\ .\) In other words, the logarithmic transform renders the \(1/f^{\alpha}\) power spectrum a straight line whose slope, \(-\alpha\ ,\) can be easily estimated. Clearly for such natural systems, observed by humans, neither arbitrarily small nor arbitrarily large frequencies can be recorded. In what follows we describe a sampling of models that are relevant to such physically plausible situations.
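The slope-estimation recipe just described can be illustrated with synthetic data. The sketch below is not from the article; the function names and parameter choices are illustrative. It synthesizes a \(1/f^{\alpha}\) series by shaping the amplitudes of randomly phased Fourier components, then recovers \(\alpha\) by a linear fit to the log-log periodogram.

```python
import numpy as np

def pink_noise(n, alpha=1.0, seed=None):
    """Synthesize a 1/f^alpha series by standard spectral shaping:
    deterministic amplitudes |X(f)| ~ f^(-alpha/2), random phases."""
    rng = np.random.default_rng(seed)
    f = np.fft.rfftfreq(n)
    amp = np.zeros_like(f)
    amp[1:] = f[1:] ** (-alpha / 2.0)   # so that S(f) = |X(f)|^2 ~ 1/f^alpha
    phases = rng.uniform(0.0, 2.0 * np.pi, f.size)
    return np.fft.irfft(amp * np.exp(1j * phases), n)

def estimate_alpha(x):
    """Estimate alpha as minus the slope of log S(f) against log f."""
    spec = np.abs(np.fft.rfft(x)) ** 2
    f = np.fft.rfftfreq(x.size)
    keep = slice(1, f.size - 1)         # drop f = 0 and the Nyquist bin
    slope, _ = np.polyfit(np.log(f[keep]), np.log(spec[keep]), 1)
    return -slope

x = pink_noise(2 ** 14, alpha=1.0, seed=0)
print(round(estimate_alpha(x), 2))
```

Because the amplitudes here are shaped deterministically, the fitted slope is essentially exact; for measured data the periodogram fluctuates around \(1/f^{\alpha}\) and the fit is correspondingly noisier.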

Figure 1: Left: color-coded realizations of time series of various noises. Right: respective power spectra of noises.

Early history

\(1/f\) noise was discovered by Johnson (1925) in data from an experiment designed to test Schottky’s (1918) theory of shot noise in vacuum tubes (Figure 2A). The noise in Johnson's experiment was not white at low frequency, and Schottky (1926) subsequently attempted to describe mathematically Johnson’s verbal explanation of the "flicker noise" he found, by assuming that an exponential relaxation,

\[N(t)=N_0e^{-\lambda t}\ ,\] \(t\geq 0\ ,\)

of a current pulse was caused by the release of electrons from the cathode of the vacuum tube. For a train of such pulses at an average rate \(n\) the power spectrum is

\[S(f)=\frac{N_0^2 n}{\lambda^2 + f^2}\]

which is nearly constant near \(f = 0\) and nearly proportional to \(1/f^2\) for large \(f\ ,\) with a narrow transition region where the power spectrum resembles that of the flicker noise found by Johnson. The form of Schottky’s expression for the power spectrum is called a “Lorentzian.” More will be said about the Lorentzian form in the section “Mathematics of \(1/f\) noise.” Bernamont (1937) pointed out that what was needed was a superposition of such processes with a variety of relaxation rates, \(\lambda\ .\) He showed that if \(\lambda\) is uniformly distributed between \(\lambda_1\) and \(\lambda_2\ ,\) the power spectral density is

\[ S(f) =\left \{ \begin{array}{ll} N_0^2 n & \textrm{if} \ 0 \ll f \ll \lambda_1 \ll \lambda_2\\ \frac{N_0^2n\pi}{2f(\lambda_2-\lambda_1)} & \textrm{if} \ \lambda_1 \ll f \ll \lambda_2 \\ N_0^2n\cdot\frac{1}{f^2} & \textrm{if} \ 0\ll \lambda_1 \ll \lambda_2 \ll f \end{array} \right. \ .\]

In other words \(S(f)\) is proportional to \(1/f\) for \( \lambda_1 \ll f \ll \lambda_2 \ .\) Somewhat later, McWhorter (1957) developed a more sophisticated model in which the noise was attributed to the trapping and detrapping of surface states.
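Bernamont's superposition result can be checked numerically. The block below (an illustrative check, not from the article) averages the Lorentzian \(N_0^2 n/(\lambda^2+f^2)\) over \(\lambda\) uniform on \([\lambda_1, \lambda_2]\) in closed form, and confirms that \(f\,S(f)\) is nearly constant, i.e. \(S(f)\propto 1/f\ ,\) throughout the band \(\lambda_1 \ll f \ll \lambda_2\ .\)

```python
import math

N0, n = 1.0, 1.0
lam1, lam2 = 1e-4, 1e4

def S(f):
    # integral of N0^2*n/(lam^2 + f^2) over lam uniform on [lam1, lam2]
    return N0**2 * n * (math.atan(lam2 / f) - math.atan(lam1 / f)) / (f * (lam2 - lam1))

# predicted plateau of f*S(f) in the band lam1 << f << lam2
plateau = N0**2 * n * math.pi / (2.0 * (lam2 - lam1))
for f in (0.01, 0.1, 1.0, 10.0, 100.0):
    print(f, round(f * S(f) / plateau, 3))
```

The printed ratios stay close to 1 across four decades of frequency, and fall away only as \(f\) approaches \(\lambda_1\) or \(\lambda_2\ .\)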

In a departure from previous theories that emphasized physical properties, Bell (1960) proposed that such noise is a cooperative phenomenon arising from the statistics of electrons queueing as they flow through a wire. In Bell's theory, electrons flowing in a wire oscillate randomly between bound and excited states, queueing up for access to sites on the atoms of the wire. The probability distribution of queueing times for the conduction band of excited, free electrons flowing in the wire is a superposition of exponential distributions with approximately equal weighting of their widely varying time constants, giving rise to the equation just above for the power spectral density. Bell (1960) also showed that various departures of power spectra from \(1/f\ ,\) in particular, the dependence of the slope of the power spectrum on the particular conductor or semiconductor, can be explained by variations in how the summation arises or in the limits over which it occurs.

Figure 2: Examples of \(1/f\) noises. Curves are illustrative based on data from the indicated sources. Adjacent pairs of tick marks on the horizontal axis beneath each figure indicate one decade of frequency.

Examples of 1/f noise

1/f noise in solids, condensed matter and electronic devices

Low-frequency noise or flicker noise has been found in many systems and has been an active research topic for more than eight decades. Most \(1/f\) noise studies were carried out on resistors, operational amplifiers or other electronic equipment and systems (http://www.nslij-genetics.org/wli/1fnoise/index.html). For an electronic system, it is relatively easy to produce samples with different noise behaviors via different fabrication processes or different measurement conditions such as temperature, stressing, biasing, etc. A special emphasis was placed on resistors. Hooge (1976, 1981, 1994) carried out a number of experiments on metal films and found that the noise of the voltage \(V\ ,\) current \(I\ ,\) conductance \(G\) and resistance \(R\) in conductors, semiconductors and other electronic devices can be characterized by

\[\frac{S_V\left( f\right)}{V^2}=\frac{S_I\left( f\right)}{I^2}= \frac{S_G\left( f\right)}{G^2}=\frac{S_R\left( f\right)}{R^2}=\frac{\alpha _H}{N_Cf},\]

where \(\alpha _H\) is a dimensionless constant (the Hooge parameter) and \(N_C\) is the number of charge carriers in the conductor. Although this formulation has been influential, and it does give a useful approximation to the power spectrum of electrical noise in many materials, Weissman (1988) reviewed evidence and arguments that it and related formulations cannot lead to a correct general theory of \(1/f\) noise. This is because in the Hooge approach, the fluctuations are tied to independent mobile charge carriers that do not persist in the material sample for long enough to generate the low frequency end of the \(1/f\) power spectrum.

Another important early study of \(1/f\) noise in semiconductors was done by Caloyannides (1974). He very carefully measured the power spectrum of voltage fluctuations through a semiconductor from around 1 Hz to \(10^{-6.3}\) Hz (Figure 2B). This required recording the voltage over a period of 3 months, reduced from what would have been 2.5 years by a variety of clever devices. He also greatly refined the process of computing the power spectrum for such noises, and proposed a model of \(1/f\) noise similar to that of McWhorter (1957).

Self-organized criticality

"Self-organized criticality" refers to the phenomenon whereby a dissipative dynamical system with many degrees of freedom operates near a configuration of minimal stability, the critical configuration, and does so without any fine tuning by an external driving influence. Favorite examples are sandpiles or mountains (re landslides), snow fields (re avalanches), and tectonic plates (re earthquakes). When the system is in the critical configuration, small fluctuations cause events of all sizes, s, with probability density \(D(s)\) a power function

\[ D(s)=ks^{-\tau}\ .\]

According to Bak, Tang, and Wiesenfeld (1987), self-organized critical dynamical systems give rise to \(1/f\) noise because the lifetime of an event, \(t\ ,\) is related to the size of the event, \(s\ ,\) by

\[t^{1+\gamma}\approx s \]

where \(\gamma\) is the rate at which the event propagates across the system. This particular model and many of its successors have been found to have some subtle inconsistencies, including a confusion between order and control parameters, and a more general, mean-field, approach has been developed that has corrected most of these inconsistencies (Vespignani & Zapperi, 1998). Importantly, real, nearly one-dimensional, "sandpiles" of rice grains do exhibit \(1/f\) noise (Maslov, Tang & Zhang, 1999). Moreover, a continuous, dissipative version of the Bak, Tang & Wiesenfeld (1987) model, with directional perturbation propagation, clearly demonstrates \(1/f\) noise in both 1- and 2-dimensional forms (De Los Rios & Zhang, 1999). In this model, \(\delta\)-correlated uniform random noise is added to a lattice at a point called the origin, and propagates across the lattice in only one direction from the origin when a critical value of the origin element is exceeded. The power spectrum of energy fluctuations in the total system is

\[S(f)=\sum_x{S(f,x)}=\int_0^L e^{\delta x} h(fe^{\delta x}) dx=\frac{1}{\delta f} \int_0^{fe^{\delta L}} h(y) dy,\]

where \(L\) is the size of the lattice, \(S(f,x)\) is the power spectrum of energy fluctuations at site \(x\ ,\) and \(e^{\delta x}\) is the characteristic time for energy to propagate to site \(x\) from the origin of the lattice. Thus, in this model the \(1/f\) noise characterizing the entire lattice arises from a linear superposition of the (uncorrelated) power spectra of independent elements, which in turn are composed of exponentials with a range of parameter values that depend on their location in the lattice, just as in the earlier models described above.

Heart beats and sway

One bridge between nonliving and living systems is the presence of pink noise in both. Several examples of pink noise in living systems are presented by Musha (1981). The time series made up of intervals between successive R peaks (reflecting muscle contractions) of the electrocardiogram of the human heart has an approximately \(1/f\) power spectrum; the slope of the log-log plot in Figure 2C is about 1.075 over several decades of frequency. Musha (1981) also reported a study of the postural sway of a person standing on a platform. The power spectral density was approximately \(1/f\) for frequencies below 1 Hz, and \(\alpha\) was slightly greater than 1 for higher frequencies. Musha thought, correctly, that the \(1/f\) power spectrum is related to the mechanism of posture control (Lauk et al., 1998).

1/f noise in the brain

Among living systems producing \(1/f\) noise is the brain. In some studies, “channel noise” in neurons, which is thought to arise from the random opening and closing of ion channels in the cell membrane, is seen to be \(1/f\ .\) One possible mechanism for this is a model of the vibration of hydrocarbon chains in cell membrane lipids that affects the conductance of potassium ions (Lundström and McQueen 1974). Musha (1981) showed that the series of fluctuations in the time density (the inverse of transmission speed) of action potentials traveling down the squid giant axon have an approximately \(1/f\) power spectrum below about 10 Hz (Figure 2D). Novikov et al. (1997) found that the activity of ensembles of neurons in the brain, recorded from relaxed human subjects by the magnetoencephalogram, shows a \(1/f\) power spectrum (Figure 2E). The log-log spectrum in Figure 2E has a slope of -1.03 over the range 0.4 to 40 Hz. Electroencephalogram recordings also display \(1/f\) noise in the brain. Ward (2002) described an unpublished study by McDonald and Ward (1998) in which a series of large event-related potentials was evoked by a 50-ms 1000-Hz tone burst at 80 dB from a human subject seated in a very quiet (35 dB background noise) sound-attenuating room. The power spectrum of the series obtained by sampling the EEG record at a time point in the pre-stimulus period, and that obtained by sampling the EEG record at the peak of the earliest negative-going event-related potential component, were both approximately \(1/f\) (Figure 2F).

Similarly, Linkenkaer-Hansen et al. (2001) showed that both MEG and EEG recordings of spontaneous neural activity in humans displayed \(1/f\)-like power spectra in the \(\alpha\ ,\) \(\mu\ ,\) and \(\beta\) frequency ranges, although the exponents tended to be somewhat less than 1 and differed across the frequency ranges. They suggested that the power-law scaling they observed arose from self-organized criticality occurring within neural networks in the brain. This inference, however, is not necessarily warranted. One recent study (Bedard et al., 2006) showed that the \(1/f\) scaling of brain local field potentials does not seem to be associated with critical states in the simultaneously-recorded neuronal activities, but rather arises from filtering of the neural signal through the cortical tissue.

Stock markets and the GNP

\(1/f\) noise in economic data is usually studied as long range dependence or long memory. It has been shown repeatedly that the autocorrelation functions of economic time series, such as series of stock prices over days, weeks or months, or the GNPs of various countries over years, do not decay exponentially as they would if the process generating the series were a simple autoregressive (AR) process (see Baillie, 1996, for a review of long memory in economic data). Instead, the autocorrelation functions of many economic time series reach a non-zero asymptote and remain there for the entire series, albeit often at a low value, indicating that economic events some distance in the past continue to have an influence on current prices or production. Such long memory processes are usually modeled in economics as fractionally-integrated white noise processes combined with AR (parameter \(p\)) and moving average (MA, parameter \(q\)) processes to form ARFIMA(\(p,d,q\)) models, introduced by Hosking (1981) as ARIMA(\(p,d,q\)) models in which the integrating parameter \(d\) is allowed to be a real number rather than an integer (as it is in the Box-Jenkins approach); these models are discussed further in the section "Identifying \(1/f\) noise" below. But this is not the only possible approach. For example, Granger (1980) was the first to show that long memory can result from an aggregation of an infinite number of AR processes with random parameters, and Granger and Ding (1996) showed that this can happen for more realistic aggregations of a finite number of AR processes. In the section "Mathematics of \(1/f\) noise," below, there is more information on the aggregation of AR processes.

A multiplicative point process model (see section "Mathematics of 1/f noise") of trading activity, including generalizations and extensions of the model that explain long-range memory volatility, has been proposed by Gontis and Kaulakys (2004, 2007).

Music, time perception, memory, and reaction times

Voss and Clarke (1975) showed that the power spectrum for intensity fluctuations in a recording of Bach's Brandenburg Concerto No. 1 (Figure 2G), and in many other instances of recorded music and human voices heard over the radio, was approximately \(1/f\) over about 3 decades of frequency. Musha (1981) also summarized several of his own studies which established that \(1/f\) noise in the spatial frequency domain characterizes some cartoons and paintings, and that transcutaneous pain reduction is more effective when applied according to a \(1/f\) sequence. Gilden, Thornton, and Mallon (1995) reported approximately \(1/f\) power spectra for time series composed of the errors made by human subjects in estimating various time intervals (Figure 2H). Similar power spectra also were found for human reaction times in a memory task (Clayton & Frey, 1995), in many other traditional tasks used in experimental psychology (Gilden, 1997), in coordination of finger-tapping with a metronome (Chen, Ding & Kelso, 1997), and even in simple detection responses (Van Orden, Holden and Turvey, 2005). In psychological data, fluctuations in the dependent variable that cannot be accounted for by the changes in the independent variable(s) are called “error” in the sense of the residuals from a linear regression. Such error is usually considered to arise from a white noise process. Gilden et al. (1995) modeled time estimation errors by a linear combination of \(1/f\) noise from an internal clock and white noise from the motor process producing a key press. Gilden (1997) extended this model to other reaction times, and in so doing, partitioned the unexplained dependent variable fluctuations, or error, into two components: \(1/f\) and white.
He found that a substantial proportion of residual error is \(1/f\ .\) Ward and Richard (reported in Ward, 2002) modeled the \(1/f\) noise component by an aggregation of three AR(1) processes with different parameters, and showed that a manipulation of decision load in a classification task, which changed the slope of the power spectrum, affected the process with the mid-range parameter most.

Mathematics of 1/f noise

Although \(1/f\) noise appears in many natural systems, as summarized above, and has been intensively studied for decades with many attempts to describe the phenomenon mathematically, researchers have not yet been able to agree on a unified explanation. Thus, there exist at present several formulations of systems that give rise to \(S(f)=constant/f^{\alpha}\ .\) In what follows we describe a few of these, with the intention of illuminating their commonalities.

A shot noise process

First, let \(t_k\) be a Poisson point process. A shot noise process is obtained by attaching to each \(t_k\) a relaxation function,

\[N(t)=N_0 e^{-\lambda t}, t \ge 0,\]

and summing on \(k\ .\) The Fourier transform of the shot noise process is

\[F(f)=\int_{-\infty}^{\infty}\sum_k N(t-t_k) e^{-ift} dt\]

and the power spectrum is

\[S(f)=\lim_{T \to \infty} \frac{1}{T} \langle \mid F(f) \mid ^2 \rangle=\frac{N_0^2 n}{\lambda^2+f^2},\]

where \(n\) is the average rate at which \(t_k\) occur, and \(T\) is the interval over which the process is observed. For an aggregation of shot noise processes with \(\lambda\) uniformly distributed on [\(\lambda_1, \lambda_2\)], the power spectrum is

\[ S(f) =\left \{ \begin{array}{ll} N_0^2 n & \textrm{if} \ 0 \ll f \ll \lambda_1 \ll \lambda_2\\ \frac{N_0^2n\pi}{2(\lambda_2-\lambda_1)}\cdot\frac{1}{f} & \textrm{if} \ \lambda_1 \ll f \ll \lambda_2 \\ N_0^2n\cdot\frac{1}{f^2} & \textrm{if} \ 0\ll \lambda_1 \ll \lambda_2 \ll f \end{array} \right. \ .\]

(Milotti 2002). Note that the power spectrum of the single shot noise process, \(S(f)=\frac{N_0^2 n}{\lambda^2+f^2}\ ,\) is a Lorentzian function, identical to the one derived by Schottky. A Lorentzian function is a single-peaked function that decays gradually on each side of the peak; it has the general form

\[G(f)=\frac{K}{C+f^2},\]

where \(C\) and \(K\) are factors that depend on the particular system modeled, and \(f\) is frequency.

If the impulse response function is a power law, \(N_0t^{-\beta}\ ,\) the process is called fractal shot noise, and the power spectrum is of the form

\[S(f)\approx\frac{k}{f^{2(1-\beta)}}.\]

When \(\beta = 1/2\ ,\) we obtain \(S(f)\approx1/f\) (Lowen & Teich 2005).

A clustering Poisson point process

Another example based on a Poisson point process is the clustering Poisson. To each Poisson point, \(t_k\ ,\) is attached an additional set of points, called a cluster, that occur after it; clusters can overlap each other. The number of points in each cluster, \(m\ ,\) is a random variable whose distribution, \(p_m\ ,\) is concentrated on a finite set of integers. The points in the cluster are spaced at i.i.d. intervals with an arbitrary interpoint distribution. The power spectral density turns out to be a sum of Lorentzian-like functions. When \(p_m\) is proportional to \(1/m^2\) we obtain \(S(f)\propto1/f\) (Gruneis & Musha 1986). A slightly different formulation is a gating process in which clusters do not overlap. Here, a Poisson point process is multiplied by a gating process that is 1 on a random interval and then 0 on a random interval and so on. To obtain a \(1/f\) noise let the intervals of 1 be exponentially distributed and the intervals of 0 be geometrically distributed, or vice versa. Then roughly the same computations as just summarized yield the \(1/f\) approximation (Gruneis 2001). Notice that for the shot noise processes, the cluster and gating processes, and the AR(1) aggregation, the power spectral density computation yields a sum of Lorentzian or Lorentzian-like functions.

Recurrence models

Recently, stochastic point process models of \(1/f\) noise have been proposed (Kaulakys and Meškauskas 1998) and generalized (Kaulakys, et al. 2005). In these models the signal consists of pulses or events

\[x(t)=a\sum_k\delta(t-t_k).\]

Here \(\delta(t)\) is the Dirac delta function, \(\{t_k\}\) is the set of occurrence times at which the particles or pulses cross the section of observation, and \(a\) is the contribution to the signal of one pulse or particle. The interpulse, interevent, interarrival, recurrence or waiting times \(\tau_k=t_{k+1}-t_k\) of the signal are described by a general Langevin equation with multiplicative noise, under which they diffuse stochastically in some interval, resulting in a power-law distribution.

Another recurrence time process generating a power-law probability distribution is a multiplicative stochastic process

\[\tau _{k+1}=\tau _{k}+\gamma \tau _{k}^{2\mu -1}+\sigma \tau _{k}^{\mu}\epsilon _{k},\]

where the \(\epsilon_k\) are i.i.d. Gaussian noise, \(\gamma\) is very small and \(\sigma\ ,\) the standard deviation of the noise, is also small, while \(\mu\) represents the degree of multiplicativity of the process. A particular form of the model is the AR(1) process

\[\tau_{k+1} - \bar{\tau} = (1-\gamma)(\tau_k - \bar{\tau})+\sigma\epsilon_k,\]

where \(\bar{\tau}\) is the mean of the inter-event intervals.

Notice that the power spectrum of this AR(1) time series process, composed of successive values of \((\tau_k - \bar{\tau}),\) is proportional to \(1/f^2\) on a long interval when \(\gamma\) is small, and thus this power spectrum is not the same as that of the point process (whose points \(t_k\) generate the time series) on that interval.

In such point process models the intrinsic origin of \(1/f\) noise is in Brownian fluctuations of the mean inter-event time of the (Poisson-like) signal pulses, similar to Brownian fluctuations of signal amplitude that result in \(1/f^2\) noise. The random walk of the inter-event time on the time axis is a property of randomly perturbed or complex systems that display self-organization.
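The multiplicative recurrence above is easy to simulate. In the sketch below, the reflecting boundaries \(\tau_{min}\ ,\) \(\tau_{max}\) and all parameter values are illustrative assumptions (the published models likewise restrict the inter-event time to a finite interval); the event times \(t_k\) are then the cumulative sums of the \(\tau_k\ .\)

```python
import numpy as np

rng = np.random.default_rng(1)
gamma, sigma, mu = 1e-4, 0.01, 0.5   # illustrative parameter choices
tau_min, tau_max = 0.1, 10.0         # assumed reflecting boundaries

# iterate tau_{k+1} = tau_k + gamma*tau_k^(2*mu-1) + sigma*tau_k^mu * eps_k
tau = np.empty(100_000)
tau[0] = 1.0
for k in range(tau.size - 1):
    nxt = (tau[k] + gamma * tau[k] ** (2 * mu - 1)
           + sigma * tau[k] ** mu * rng.standard_normal())
    if nxt < tau_min:                # reflect back into the interval
        nxt = 2 * tau_min - nxt
    elif nxt > tau_max:
        nxt = 2 * tau_max - nxt
    tau[k + 1] = nxt

t_events = np.cumsum(tau)            # occurrence times t_k of the pulses
print(tau.min() >= tau_min, tau.max() <= tau_max)
```

The inter-event time wanders slowly, random-walk fashion, across the interval, which is exactly the Brownian fluctuation of the mean inter-event time described above.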

A stochastic differential equation model

Nonlinear stochastic differential equations

\[dx=\Gamma x^{2\eta-1}dt+x^{\eta}dW\]

for the signal \(x=1/\tau\) have been derived (Kaulakys et al. 2006). These models exhibit \(1/f^{\alpha}\) noise and a \(1/x^{\lambda}\) distribution of signal intensity when \(\eta>1\) for different values of \(\alpha\) and \(\lambda\ .\) Here

\[\eta=\frac{5}{2}-\mu, \quad \Gamma=1-\frac{\gamma}{\sigma^{2}}\]

and \(W\) is a Wiener process.

These models can generate a variety of monofractal and multifractal time series exhibiting spectra \(S(f)\propto1/f^\alpha\) with \(0.5\lesssim \alpha \lesssim 2\) and \(1/f\) noise with a very large Hooge parameter. They may be used as a theoretical framework for understanding huge fluctuations (Kruppa 2006), for the analysis of financial systems (Gontis and Kaulakys 2004, 2007), and for the description of a large variety of observable statistics.
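An Euler-Maruyama sketch of the equation above, with \(\mu=1/2\) so that \(\eta=2\ .\) The reflecting boundaries, step size and parameter values are illustrative assumptions, not prescribed by the source; as in the published models, the diffusion is restricted to a finite interval.

```python
import numpy as np

rng = np.random.default_rng(2)
eta, Gamma = 2.0, 0.1                # eta = 5/2 - mu with mu = 1/2
x_min, x_max = 0.1, 1.0              # assumed reflecting boundaries
dt = 1e-4

# Euler-Maruyama for dx = Gamma*x^(2*eta-1) dt + x^eta dW
x = np.empty(100_000)
x[0] = 0.5
for k in range(x.size - 1):
    nxt = (x[k] + Gamma * x[k] ** (2 * eta - 1) * dt
           + x[k] ** eta * np.sqrt(dt) * rng.standard_normal())
    if nxt < x_min:                  # reflect back into the interval
        nxt = 2 * x_min - nxt
    elif nxt > x_max:
        nxt = 2 * x_max - nxt
    x[k + 1] = nxt

# A standard Fokker-Planck calculation gives the stationary density
# p(x) proportional to x^(2*Gamma - 2*eta), i.e. the power-law 1/x^lambda
# distribution of signal intensity mentioned above.
print(x.min() >= x_min, x.max() <= x_max)
```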

Reversible Markov chain models

A class of reversible Markov chains which have the \(1/f^{\alpha}\) property is given in Erland & Greenwood (2007). Sufficient conditions are found on the eigenvalues and the eigenfunctions of the Markov chain to generate \(1/f^{\alpha}\) fluctuations. A particular case is a collection of (discrete-time) AR(1) processes with different parameters, \(\theta_m\ :\)

\[X_t^m=\theta_mX_{t-1}^m+\sigma\epsilon_t,\]

where the \(\epsilon_t\) are independent and identically distributed (i.i.d.) Gaussian noise, with mean 0 and standard deviation \(\sigma\ ,\) and the \(\theta_m\) are widely distributed between 0 and 1. The autocorrelation function of \(X_t^m\) decays exponentially in time with rate \(\theta_m\ .\) The spectral densities of the time series generated by these processes taken one at a time are

\[ S_m(f) =\frac{\sigma^2}{(1-\theta_m)^2+2\theta_m(1-\cos f)}\approx \left \{ \begin{array}{ll} \sigma^2/(1-\theta_m)^2 & \textrm{if} \ 0 \ll f \ll 1-\theta_m\\ \sigma^2/\theta_mf^2 & \textrm{if} \ 1-\theta_m \ll f \le 1 \end{array} \right. \ .\]

Let \[Y_t=\frac{1}{m}\sum_{i=1}^{m} X_t^i,\]

and suppose the coefficients \(\theta\) have density proportional to \((1-\theta)^{1-\alpha}\) for \(\theta_{min}<\theta<\theta_{max}\ .\) The power spectral density of \(Y_t\ ,\) for large \(m\ ,\) is approximately

\[ S(f) \propto \left \{ \begin{array}{ll} 1 & \textrm{if} \ 0 \ll f \ll 1-\theta_{max}\\ 1/f^\alpha & \textrm{if} \ 1-\theta_{max} \ll f \ll 1-\theta_{min}\\ 1/f^2 & \textrm{if} \ 1-\theta_{min} \ll f \ll 1 \end{array} \right. \ .\]

(Erland and Greenwood 2007). Notice also that \(S_m(f)\) is a Lorentzian-like function.

If the coefficients \(\theta_m\) in the AR(1) processes are uniformly distributed \((\alpha=1)\ ,\) one obtains a good approximation of \(1/f\) noise simply by averaging the individual series. This corresponds to the classical result that the power spectrum of a uniform mixture of exponentially decaying autocorrelation functions has a \(1/f\) form (Bell, 1960). Even the sum of as few as three AR(1) processes with widely distributed coefficients (e.g., 0.1, 0.5, 0.9) gives a reasonable approximation to a \(1/f\) power spectrum (Ward 2002).
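The three-AR(1) approximation is easy to check by simulation. In the sketch below (the series length, seed and fit band are illustrative assumptions), three independent AR(1) series with \(\theta=0.1, 0.5, 0.9\) are summed and \(\alpha\) is estimated from a log-binned periodogram:

```python
import numpy as np

rng = np.random.default_rng(0)
thetas = (0.1, 0.5, 0.9)
n = 1 << 16

# sum of three independent AR(1) processes
y = np.zeros(n)
for theta in thetas:
    eps = rng.standard_normal(n)
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = theta * x[t - 1] + eps[t]
    y += x

spec = np.abs(np.fft.rfft(y)) ** 2 / n
f = np.fft.rfftfreq(n) * 2.0 * np.pi    # angular frequency, as in S_m(f)

# average the periodogram in log-spaced bins over the 1/f band, then fit
edges = np.geomspace(0.05, 0.5, 11)
logf = [np.log(f[(f >= lo) & (f < hi)]).mean() for lo, hi in zip(edges[:-1], edges[1:])]
logS = [np.log(spec[(f >= lo) & (f < hi)].mean()) for lo, hi in zip(edges[:-1], edges[1:])]
slope, _ = np.polyfit(logf, logS, 1)
print(round(-slope, 1))
```

With coefficients this widely spaced, the fitted exponent typically comes out close to 1 over the band between \(1-\theta_{max}\) and \(1-\theta_{min}\ ,\) as the piecewise spectrum above predicts.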

Identifying 1/f noise

A useful approach to \(1/f\) noise would be to state a model, derive an expression for the power spectrum in the form \(1/f^\alpha\ ,\) collect or simulate data relevant to the model, and then estimate the parameter \(\alpha\) using linear regression or some similar method (Pilgram & Kaplan, 1998) from the log-transformed power spectrum of the data. More usually, physicists (such as Bell, 1960), biologists, engineers, psychologists, and so forth have noticed, from plotting the power spectrum on a log-log plot, that the noise produced in a particular experimental situation is \(1/f\)-like. They have then proceeded, often in the absence of a specific model, to estimate \(\alpha\) using linear fits. They also often test this estimate against a null hypothesis of \(\alpha= 0\) or \(\alpha= 1\ .\)

If one has a specific model in mind, such as one of those described in the previous section, the best procedure would be to estimate the parameters of that model from one’s data and determine whether those estimates imply \(1/f\) noise in the context of that model. This may be difficult to accomplish because of lack of an appropriate estimation scheme. Wagenmakers, Farrell and Ratcliff (2004) have suggested, in the context of assessing the presence of long range dependence and/or \(1/f\) noise in psychological data, that in the absence of a model one might use a scheme based on parameter estimation for a model in which the \(1/f\) property arises from fractionally differenced white noise (Beran, 1994). This is the ARFIMA model (Hosking, 1981), where AR stands for an auto-regressive process, MA stands for a moving average process, and FI stands for the fractionally integrated white noise process. In the scheme of Wagenmakers et al (2004), an ARMA (1,1) model is tested against the ARFIMA (1,\(d\ ,\)1) model, where \(d\) is the exponent of the fractional differencing operator. The test is done by a procedure based on the Akaike Information Criterion (AIC) (Akaike, 1974; Burnham & Anderson, 2002).

For simplicity, consider ARMA (1,1):

\[X_t=\theta_1 X_{t-1} + \epsilon_t+\phi_1 \epsilon_{t-1},\]

where \(0 < \theta_1 < 1\) is the AR parameter, \(0 < \phi_1 < 1\) is the MA parameter, and \(\epsilon_t\) is i.i.d. Gaussian noise. The fractional differencing operator is

\[\nabla^d = (1-B)^d = \sum_{k=0}^{\infty} \binom{d}{k} (-B)^k,\]

where \(B\) is the backward shift operator defined on the sequence \(X_t\) as \(BX_t = X_{t-1}\ ,\) and \(d\) is the fractional differencing parameter. The ARFIMA (1,\(d\ ,\)1) model is constructed by first applying the fractional differencing operator to the noise, \(\epsilon_t\ ,\) and then using the resulting noise to construct an ARMA process. If \(d\) can take on only integer values we have an ARIMA model, which does not display \(1/f\) noise for small values of \(p\) and \(q\ .\) The ARFIMA process has a power spectrum

\[S(f)=\frac{k}{f^{2d}},\]

and if \(0 < d < 1/2\ ,\) then the process is said to be stationary with long range dependence. If \(d = 1/2\) then the power spectrum is exactly \(1/f\ .\) The procedure of Wagenmakers et al (2004) uses an algorithm of Doornik and Ooms (2003) to estimate \(\hat{d},\) \(\hat{\theta}_1,\) and \(\hat{\phi}_1\) for a time series and, based on these estimates, tests the ARFIMA(1,\(d\ ,\)1) model against the ARMA(1,1) model. A preference for the ARFIMA model is said to indicate the presence of long range dependence, and \(1/f\) noise, in the time series. The exponent \(\alpha\) in \(S(f)=constant/f^\alpha\) is estimated as \(\hat{\alpha}=2\hat{d}\ .\)
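The fractional differencing operator is straightforward to compute. The sketch below is not from the article, and the function names are illustrative: it generates the coefficients \(\binom{d}{k}(-1)^k\) of \((1-B)^d\) by the standard recursion \(c_0=1\ ,\) \(c_k=c_{k-1}(k-1-d)/k\ ,\) and uses the analogous expansion of \((1-B)^{-d}\) to turn white noise into fractionally integrated noise.

```python
import random

def frac_diff_coeffs(d, n):
    """First n coefficients c_k of (1 - B)^d = sum_k binom(d, k) (-B)^k."""
    c = [1.0]
    for k in range(1, n):
        c.append(c[-1] * (k - 1 - d) / k)
    return c

def fi_noise(d, n, seed=0):
    """Fractionally integrated white noise x_t = sum_k psi_k * eps_{t-k},
    where psi_k are the coefficients of (1 - B)^(-d)."""
    rng = random.Random(seed)
    psi = [1.0]
    for k in range(1, n):
        psi.append(psi[-1] * (k - 1 + d) / k)
    eps = [rng.gauss(0.0, 1.0) for _ in range(n)]
    return [sum(psi[k] * eps[t - k] for k in range(t + 1)) for t in range(n)]

print([round(c, 4) for c in frac_diff_coeffs(0.5, 4)])  # -> [1.0, -0.5, -0.125, -0.0625]
```

Since \(S(f)\propto 1/f^{2d}\) for the fractionally integrated process, feeding the output of such a generator into the slope-fitting procedure described earlier should recover \(\hat{\alpha}\approx 2d\ .\)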

Although appealing, this procedure may have a limited range of applicability. It is not a model identification procedure but rather a test of whether a given time series possesses a \(1/f^\alpha\) character. For example, when simulated data from an aggregation of three AR(1) processes with widely distributed parameter values, \(\theta_m\ ,\) were analyzed, the procedure preferred the ARFIMA(1,\(d\ ,\)1) model to the ARMA(1,1) model in 96.7% of the realizations (Wagenmakers et al., 2004). Such a time series does have a \(1/f^\alpha\) character but was not generated by an ARFIMA(1,\(d\ ,\)1) process.
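The aggregation effect can be reproduced in a few lines: sum several AR(1) processes with widely spread coefficients and estimate the spectral exponent. This is a rough sketch in which a log-periodogram regression stands in for the ARFIMA-based test, and the particular \(\theta_m\) values are illustrative assumptions, not those of Wagenmakers et al. (2004):

```python
import numpy as np

def ar1(theta, n, rng):
    """Simulate an AR(1) process X_t = theta * X_{t-1} + eps_t."""
    x = np.zeros(n)
    eps = rng.standard_normal(n)
    for t in range(1, n):
        x[t] = theta * x[t - 1] + eps[t]
    return x

def spectral_slope(x):
    """Estimate alpha in S(f) ~ const / f^alpha by least-squares
    regression of the log periodogram on log frequency."""
    n = len(x)
    f = np.fft.rfftfreq(n)[1:]                        # drop zero frequency
    pxx = np.abs(np.fft.rfft(x - x.mean())[1:]) ** 2 / n
    slope, _ = np.polyfit(np.log(f), np.log(pxx), 1)
    return -slope

rng = np.random.default_rng(2)
n = 1 << 14
# Aggregate three AR(1) processes with widely distributed coefficients
# (illustrative values); the sum exhibits a 1/f^alpha-like spectrum
# over an intermediate frequency range.
y = sum(ar1(theta, n, rng) for theta in (0.5, 0.95, 0.999))
alpha = spectral_slope(y)
```

For pure white noise the estimated slope is near zero, while the aggregated series yields a clearly positive \(\hat{\alpha}\ ,\) consistent with the point that a \(1/f^\alpha\) character does not by itself identify the generating model.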

It appears that there is no statistical criterion for the presence of \(1/f^\alpha\) character in point process data. If the data were from a point process such as that of the Kaulakys & Meškauskas (1998) model in the "Mathematics of 1/f noise" section, the power spectra of the point process and of the AR(1) time series generated from it would not be the same. In that case, applying the Wagenmakers et al. (2004) procedure to the AR(1) time series, while it would correctly identify the AR(1) process producing the time series, would fail to identify the \(1/f^\alpha\) character in the associated point process.

There are still other approaches to characterizing time series data in ways relevant to \(1/f\) noise. These include rescaled range analysis (e.g., Chen, Ding & Kelso, 1997) and the Hurst exponent (e.g., Schroeder, 1991), both of which involve estimating parameters related to the exponent of the power spectrum, \(\alpha\ ;\) for the Hurst exponent, \(H=(1+\alpha)/2\ .\) There are obvious advantages to analyzing one's time series data from several perspectives, but no canonical procedure as yet exists.
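As one example of these alternatives, the Hurst exponent can be estimated by rescaled range (R/S) analysis. The sketch below is one simple implementation under stated assumptions (dyadic window doubling and plain chunk averaging are implementation choices, not a canonical procedure); for white noise it returns a value near the theoretical \(H = 0.5\ ,\) with the well-known upward small-sample bias of R/S:

```python
import numpy as np

def hurst_rs(x, min_window=16):
    """Rescaled-range (R/S) estimate of the Hurst exponent H:
    the slope of log(R/S) versus log(window length).  H relates to
    the spectral exponent via H = (1 + alpha) / 2."""
    x = np.asarray(x, dtype=float)
    sizes, rs = [], []
    w = min_window
    while w <= len(x) // 2:
        vals = []
        for i in range(0, len(x) - w + 1, w):
            c = x[i:i + w]
            dev = np.cumsum(c - c.mean())     # cumulative deviation from mean
            s = c.std()
            if s > 0:
                vals.append((dev.max() - dev.min()) / s)  # range / std
        sizes.append(w)
        rs.append(np.mean(vals))
        w *= 2                                # double the window each pass
    slope, _ = np.polyfit(np.log(sizes), np.log(rs), 1)
    return slope

h = hurst_rs(np.random.default_rng(3).standard_normal(8192))
```

Values of \(H\) substantially above \(1/2\) indicate persistent long range dependence, i.e. \(\alpha = 2H - 1 > 0\ .\)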

References

  • Akaike H. (1974) A new look at the statistical model identification. IEEE Transactions on Automatic Control, 19:716--723.
  • Baillie R.T. (1996) Long memory processes and fractional integration in econometrics. J. Economet., 73:5--59.
  • Bak P. Tang C. Wiesenfeld K. (1987) Self-organized criticality: An explanation of 1/f noise. Phys. Rev. Lett., 59:381--384.
  • Bedard C. Kroger H. Destexhe A. (2006) Does the 1/f frequency scaling of brain signals reflect self-organized critical states? Phys. Rev. Lett., 97: 118102.
  • Bell D.A. (1960) Electrical Noise. London: Van Nostrand.
  • Beran J. (1994). Statistics for long-memory processes. New York: Chapman & Hall.
  • Bernamont J. (1937) Fluctuations de potential aux bornes d'un conducteur metallique de faible volume parcouru par un courant. Ann. Phys. (Leipzig), 7:71--140.
  • Burnham K. P. Anderson D. R. (2002). Model selection and multimodel inference: A practical information-theoretic approach. New York: Springer-Verlag.
  • Chen, Y. Ding, M. Kelso, J.A.S. (1997) Long memory processes (\(1/f^\alpha\) type) in human coordination. Phys. Rev. Lett., 79:4501-4504.
  • Clayton K. Frey B. (1997) Studies of mental “noise.” Nonlin. Dynam. Psychol. Life Sci., 1:173--180.
  • Caloyannides M.A. (1974) Microcycle spectral estimates of 1/f noise in semiconductors. J. Appl. Phys., 45:307--316.
  • De Los Rios P. Zhang Y.-C. (1999) Universal 1/f noise from dissipative self-organized criticality models. Phys. Rev. Lett., 82:472--475.
  • Doornik J.A. and Ooms M. (2003) Computational Aspects of Maximum Likelihood Estimation of Autoregressive Fractionally Integrated Moving Average Models. Computa. Stat. Data Anal., 42:333-348.
  • Dutta P. Horn P. M. (1981) Low-frequency fluctuations in solids: 1/f noise. Rev. Mod. Phys., 53:497--516.
  • Erland S. Greenwood P.E. (2007) Constructing \(1/f^\alpha\) noise from reversible Markov chains. Phys. Rev. E, 76:
  • Gilden D.L. (1997) Fluctuations in the time required for elementary decisions. Psychol. Sci., 8:296--301.
  • Gilden D.L. Thornton T. Mallon M.W. (1995) 1/f noise in human cognition. Science, 267:1837--1839.
  • Gontis V. Kaulakys B. (2004) Multiplicative point process as a model of trading activity. Physica A, 343:505-514; cond-mat/0303089.
  • Gontis V. Kaulakys B. (2007) Modeling long-range memory trading activity by stochastic differential equations. Physica A, 382(1):114-120; physics/0608036.
  • Granger C.W.J. (1980) Long memory relationships and the aggregation of dynamic models. J. Economet., 14:227--238
  • Granger C.W.J. Ding Z. (1996) Varieties of long memory models. J. Economet., 73:61--77
  • Gruneis F. (2001) 1/f noise, intermittency and clustering Poisson process. Fluct. Noise Lett., 1:R119--R130.
  • Gruneis F. Musha T. (1986) Clustering Poisson process and 1/f noise. Jap. J. Applied Phys., 25:1504--1509.
  • Hooge F. N. (1976) 1/f noises. Physica A&C, 83: 14-23.
  • Hooge F. N., Kleinpenning T. G. M., Vandamme L. K. J. (1981) Experimental studies on 1/f noise. Rep. Prog. Phys., 44:479-- 532.
  • Hooge F. N. (1994) 1/f noise sources. IEEE Trans. Electron. Devic., 41: 1926-1935.
  • Hosking J.R.M. (1981) Fractional differencing. Biometrika, 68:165-176.
  • Johnson J.B. (1925) The Schottky effect in low frequency circuits. Phys. Rev., 26:71--85.
  • Kaulakys B. Meškauskas T. (1998) Modeling 1/f noise. Phys. Rev. E, 58:7013--7019.
  • Kaulakys B. Gontis V. Alaburda M. (2005) Point process model of 1/f noise vs a sum of Lorentzians. Phys. Rev. E, 71:051105; cond-mat/0504025.
  • Kaulakys B. Ruseckas J. Gontis V. Alaburda M. (2006) Nonlinear stochastic models of 1/f noise and power-law distributions. Physica A, 365:217-221; cond-mat/0509626.
  • Kogan S. M. (1985) Low-frequency current 1/f-noise in solids. Uspekhi Fizicheskikh Nauk, 145:285-328; Sov. Phys. Usp. 28:170.
  • Lauk M. Chow C.C. Pavlik A.E. Collins J.J. (1998) Human balance out of equilibrium: Nonequilibrium statistical mechanics in posture control. Phys. Rev. Lett., 80:413--416.
  • Linkenkaer-Hansen, K. Nikouline, V.V. Palva, J.M. Ilmoniemi, R.J. (2001). Long-range temporal correlations and scaling behavior in human brain oscillations. J. Neurosci., 21:1370-1377.
  • Lowen, B. & Teich, M.C. 2005 Fractal-based Point Processes. Hoboken, NJ: Wiley.
  • Lundström I. McQueen D. (1974) A proposed 1/f noise mechanism in nerve cell membranes. J. Theoret. Biol., 45:405--409.
  • Mandelbrot B. (1998) Multifractals and 1/f noise: Wild self-affinity in physics. New York: Springer.
  • Maslov S. Tang C. Zhang Y.C. (1999) 1/f noise in Bak-Tang-Wiesenfeld models on narrow strips. Phys. Rev. Lett., 83:2449--2452.
  • McWhorter A. L.(1957) 1/f noise and germanium surface properties. In Semiconductor Surface Physics, edited by R. H. Kingston, University of Pennsylvania, Philadelphia, pp. 207-- 228.
  • Milotti, E. (2002) A pedagogical review of 1/f noise. Arxiv preprint physics/0204033, 2002 - arxiv.org.
  • Musha T. (1981) 1/f fluctuations in biological systems. In P.H.E. Meijer, R.D. Mountain & R.J. Soulen, Jr. (Eds.), Sixth International Conference on Noise in Physical Systems (pp. 143-146). Washington, DC: U.S. Department of Commerce and National Bureau of Standards.
  • Novikov E. Novikov A. Shannahoff-Khalsa D. Schwartz B Wright J. (1997) Scale-similar activity in the brain. Phys. Rev. E, 56:R2387-R2389.
  • Pilgram, B. Kaplan, D.T. (1998) A comparison of estimators for 1/f noise. Physica D, 114:108-122.
  • Press W. H. 1978 Flicker noises in astronomy and elsewhere. Comments on Astrophysics, 7:103-119.
  • Schottky W. (1918) Über spontane Stromschwankungen in verschiedenen Elektrizitätsleitern. Ann. der Phys., 57:541-567.
  • Schottky W. (1926) Small-shot effect and flicker effect. Phys. Rev., 28:74--103.
  • Schroeder M. (1991) Fractals, Chaos, Power Laws: Minutes from an Infinite Paradise. New York: Freeman.
  • Van Orden G.J. Holden J.G. Turvey M.T. (2005) Human cognition and 1/f scaling. J. Exper. Psychol.: Gen., 132:331--350.
  • Van Vliet C. M. (1991) A survey of results and future prospects on quantum 1/f noise and 1/f noise in general. Solid-State Electron., 34: 1-- 21.
  • Vespignani A. Zapperi S. (1998) How self-organized criticality works. A unified mean-field picture. Phys. Rev. E, 57:6345--6362.
  • Voss R.F. Clarke J. (1975) 1/f noise in music and speech. Nature, 258:317--318.
  • Wagenmakers E-J. Farrell S. Ratcliff R. (2004) Estimation and interpretation of \(1/f^\alpha\) noise in human cognition. Psychonom. Bull. Rev., 11:579--615.
  • Weissman M. B. (1988) 1/f noise and other slow, nonexponential kinetics in condensed matter. Rev. Mod. Phys., 60:537--571.
  • Ward L.M. (2002) Dynamical Cognitive Science. Cambridge, MA: MIT Press.
  • West B. J. Shlesinger M.F. (1990) The noise in natural phenomena, American Scientist, 78:40-45.
  • Wong H. (2003) Low-frequency noise study in electron devices: review and update. Microelectron. Reliab., 43:585-- 599.
  • Zhigalskii G. P. (1997) 1/f noise and nonlinear effects in thin metal films. Uspekhi Fizicheskikh Nauk, 167:623-648; Phys. Usp., 40, 599.

Internal references

  • Jan A. Sanders (2006) Averaging. Scholarpedia, 1(11):1760.
  • Valentino Braitenberg (2007) Brain. Scholarpedia, 2(11):2918.
  • Eugene M. Izhikevich (2006) Bursting. Scholarpedia, 1(3):1300.
  • Gregoire Nicolis and Catherine Rouvas-Nicolis (2007) Complex systems. Scholarpedia, 2(11):1473.
  • James Meiss (2007) Dynamical systems. Scholarpedia, 2(2):1629.
  • Paul L. Nunez and Ramesh Srinivasan (2007) Electroencephalogram. Scholarpedia, 2(2):1348.
  • Mark Aronoff (2007) Language. Scholarpedia, 2(5):3175.
  • Jeff Moehlis, Kresimir Josic, Eric T. Shea-Brown (2006) Periodic orbit. Scholarpedia, 1(7):1358.
  • Philip Holmes and Eric T. Shea-Brown (2006) Stability. Scholarpedia, 1(10):1838.

External Links

  • 1 / f noise, Wikipedia [1]
  • Pink noise, Wikipedia [2]
  • A Bibliography on 1/f Noise [3]
  • Fluctuation and Noise Letters (FNL) [4]
  • DSP generation of Pink (1/f) Noise [5]
  • Association for Science, Art and Technology of Fluctuations: ASATeF [6]
  • The Allan Variance [7]
  • Reliability & noise control lab [8]
  • Real-time Wavelet-transform spectrum analyzer for the investigation of 1/f α noise [9]
  • GENERAL QUANTUM 1/f NOISE BIBLIOGRAPHY, P.H. Handel [10]
  • Proceedings of the International Conference on Noise in Physical Systems and 1/f Fluctuations [11]

Recommended Reading

  • Beran J. (1994). Statistics for long-memory processes. New York: Chapman & Hall.
  • Hosking J.R.M. (1981) Fractional differencing. Biometrika, 68:165-176.
  • Mandelbrot B. (1998) Multifractals and 1/f noise: Wild self-affinity in physics. New York: Springer.
  • Milotti, E. (2002) A pedagogical review of 1/f noise. Arxiv preprint physics/0204033, 2002 - arxiv.org.
  • Schroeder M. (1991) Fractals, Chaos, Power Laws: Minutes from an Infinite Paradise. New York: Freeman.
  • Ward L.M. (2002) Dynamical Cognitive Science. Cambridge, MA: MIT Press.


See Also

Chaos, Fractals, Noise, Sandpiles, Self-organized Criticality
