Harald Atmanspacher and Peter beim Graben (2009), Scholarpedia, 4(3):7997. doi:10.4249/scholarpedia.7997, revision #184581
Contextual emergence characterizes a specific kind of relationship between different domains of scientific descriptions of particular phenomena. Although these domains are not ordered strictly hierarchically, one often speaks of lower and higher levels of description, where lower levels are considered as more fundamental in a certain sense. As a rule, phenomena at higher levels of description are more complex than phenomena at lower levels. This increasing complexity depends on contingent conditions, so-called contexts, that must be taken into account for an appropriate description.
Moving up or down in the hierarchy of descriptions also decreases or increases the amount of symmetries relevant at the respective level. A (hypothetical) description at a most fundamental level would have no broken symmetry, meaning that such a description is invariant under all conceivable transformations. This would amount to a description completely free of contexts: everything is described by one (set of) fundamental law(s). Indeed, this is sometimes called (the dream of) a "theory of everything", but it is equally correct to call it – literally – a "theory of nothing". The consequence of complete symmetry is that there are no distinguishable phenomena. Broken symmetries provide room for contexts and, thus, "create" phenomena.
Contextual emergence utilizes lower-level features as necessary (but not sufficient) conditions for the description of higher-level features. As will become clear below, it can be viably combined with the idea of multiple realization, a key issue in supervenience (Kim 1992, 1993), which poses sufficient but not necessary conditions at the lower level. Both contextual emergence and supervenience are interlevel relations more specific than a patchwork scenario as in radical emergence and more flexible than a radical reduction where everything is already contained at a lower (or lowest) level.
Contextual emergence is intended as a structural relation between different levels of description. As such, it belongs to the class of synchronic types of emergence (Stephan 1999). It does not address questions of diachronic emergence, referring to how new qualities arise dynamically, as a function of time. Contextual emergence also differs from British emergentism from Mill to Broad. An informative discussion of various types of emergence versus reductive interlevel relations is due to Beckermann et al. (1992), see also Gillett (2002).
The conceptual scheme
The basic idea of contextual emergence is to establish a well-defined interlevel relation between a lower level \(L\) and a higher level \(H\) of a system. This is done by a two-step procedure that leads in a systematic and formal way (1) from an individual description \(L_i\) to a statistical description \(L_s\) and (2) from \(L_s\) to an individual description \(H_i\). This scheme can in principle be iterated across any connected set of descriptions, so that it is applicable to any case that can be formulated precisely enough to be a sensible subject of a scientific investigation.
The essential goal of step (1) is the identification of equivalence classes of individual states that are indistinguishable with respect to a particular ensemble property. This step implements the multiple realizability of statistical states in \(L_s\) (which will be the basis for individual states in \(H_i\)) by individual states in \(L_i\ .\) The equivalence classes at \(L\) can be regarded as cells of a partition. Each cell is the support of a (probability) distribution representing a statistical state, encoding limited knowledge about individual states.
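As a minimal, purely illustrative sketch of step (1), the following Python fragment groups hypothetical one-dimensional microstates into equivalence classes defined by the cells of a partition; the partition boundary and the sample states are assumptions made for the example, not part of the general scheme.

```python
# Minimal sketch of step (1): microstates that fall into the same
# partition cell are identified, forming the support of a statistical
# state in L_s. Partition and microstates are illustrative assumptions.

def cell(x, boundaries):
    """Return the index of the partition cell containing microstate x."""
    for i, b in enumerate(boundaries):
        if x < b:
            return i
    return len(boundaries)

# A one-dimensional microstate space partitioned at 0.0 (two cells).
boundaries = [0.0]
microstates = [-0.7, -0.1, 0.3, 0.9, -0.2]

# Equivalence classes: microstates grouped by the cell they occupy.
classes = {}
for x in microstates:
    classes.setdefault(cell(x, boundaries), []).append(x)

# Cell 0 collects all negative microstates, cell 1 all positive ones;
# differences within a cell are irrelevant at the statistical level.
print(classes)  # {0: [-0.7, -0.1, -0.2], 1: [0.3, 0.9]}
```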
The essential goal of step (2) is the assignment of individual states at level \(H\) to coextensional statistical states at level \(L\ .\) This is impossible without additional information about the desired level-\(H\) description. In other words, it requires the choice of a context setting the framework for the set of observables (properties) at level \(H\) that is to be constructed from level \(L\ .\) The chosen context provides conditions that can be implemented as stability criteria at level \(L\ .\) It is crucial that such stability conditions cannot be specified without knowledge about the context at level \(H\ .\) In this sense the context yields a top-down constraint, or downward confinement (sometimes misleadingly called downward causation).
The notion of stability induced by context is of paramount significance for contextual emergence. Roughly speaking, stability refers to the fact that a system is robust under (small) perturbations: for instance, (small) perturbations of a homeostatic or equilibrium state are damped out by the dynamics, so that the initial state is (asymptotically) retained. The more complicated notion of a stable partition of a state space is based on the idea of coarse-grained states, i.e. cells of a partition whose boundaries are (approximately) maintained under the dynamics.
Stability criteria guarantee that the statistical states of \(L_s\) are based on a robust partition so that the emergent observables in \(H_i\) are well-defined. (For instance, if a partition is not stable under the dynamics of the system at \(L_i\ ,\) the assignment of states in \(H_i\) will change over time and, thus, will be ill-defined.) Implementing a contingent context of \(H_i\) as a stability criterion in \(L_i\) yields a proper partitioning for \(L_s\ .\) In this way, the lower-level state space is endowed with a new, contextual topology (see Atmanspacher (2007) and Atmanspacher and Bishop (2007) for more details).
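The difference between a dynamics-adapted partition and an arbitrary one can be made concrete with a toy map. In the hypothetical example below, the map \(f(x) = x(1.5 - 0.5x^2)\) has stable fixed points at \(\pm 1\) separated by an unstable fixed point at \(0\); the partition whose boundary coincides with the basin boundary at \(0\) is preserved by the dynamics, while an arbitrarily shifted boundary is not. The map and the sample states are assumptions for illustration only.

```python
# Sketch of a dynamics-adapted ("stable") partition versus an arbitrary
# one, for the illustrative map f(x) = x*(1.5 - 0.5*x**2), which has
# stable fixed points at +1 and -1 and an unstable fixed point at 0.

def f(x):
    return x * (1.5 - 0.5 * x * x)

samples = [-1.2, -0.6, -0.1, 0.1, 0.4, 0.9, 1.3]

# The partition with boundary at 0 (the basin boundary) is stable:
# no sample trajectory changes its cell under one application of f.
stable = all((x < 0) == (f(x) < 0) for x in samples)

# An arbitrary boundary at 0.5 is not stable: e.g. x = 0.4 starts
# below 0.5 but is mapped to f(0.4) = 0.568, crossing the boundary.
unstable = all((x < 0.5) == (f(x) < 0.5) for x in samples)

print(stable, unstable)  # True False
```

States assigned by the sign partition therefore remain well-defined as the system evolves, which is exactly the requirement formulated above.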
From a slightly different perspective, the context selected at level \(H\) decides which details in \(L_i\) are relevant and which are irrelevant for individual states in \(H_i\ .\) Differences among all those individual states at \(L_i\) that fall into the same equivalence class at \(L_s\) are irrelevant for the chosen context. In this sense, the stability condition determining the contextual partition at \(L_s\) is also a relevance condition.
The interplay of context and stability across levels of description is the core of contextual emergence. Its proper implementation requires an appropriate definition of individual and statistical states at these levels. This means in particular that it would not be possible to construct emergent observables in \(H_i\) from \(L_i\) directly, without the intermediate step to \(L_s\ .\) And it would be equally impossible to construct these emergent observables without the downward confinement arising from higher-level contextual constraints.
In this spirit, bottom-up and top-down strategies are interlocked with one another in such a way that the construction of contextually emergent observables is self-consistent. Higher-level contexts are required to implement lower-level stability conditions leading to proper lower-level partitions, which in turn are needed to define those lower-level statistical states that are co-extensional (not necessarily identical!) with higher-level individual states and their associated observables.
Example: From mechanics to thermodynamics
As an example, consider the transition from classical point mechanics via statistical mechanics to thermodynamics (Bishop and Atmanspacher 2006). Step (1) in the discussion above is here the step from point mechanics to statistical mechanics, essentially based on the formation of an ensemble distribution. Particular properties of a many-particle system are defined in terms of a statistical ensemble description (e.g., as moments of a many-particle distribution function) which refers to the statistical state of an ensemble (\(L_s\)) rather than the individual states of single particles (\(L_i\)).
An example of an observable associated with the statistical state of a many-particle system is its mean kinetic energy, which can be calculated from the Maxwell-Boltzmann distribution of the momenta of all \(N\) particles. The expectation value of the kinetic energy is defined as the limit of its mean value for infinite \(N\).
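This construction can be illustrated numerically. The sketch below samples momenta from a Maxwell-Boltzmann distribution in illustrative units chosen so that the particle mass \(m = 1\) and \(k_B T = 1\); the ensemble mean of the kinetic energy then approaches the expectation value \(\tfrac{3}{2} k_B T\) for large \(N\).

```python
import random

# Hedged numerical sketch: momenta of an ideal-gas ensemble are drawn
# from the Maxwell-Boltzmann distribution (each momentum component is
# Gaussian). Units are chosen so that m = 1 and k_B*T = 1, in which
# case the expected kinetic energy per particle is (3/2) k_B*T = 1.5.

random.seed(0)
N = 100_000  # ensemble size; the N -> infinity limit gives the expectation

def kinetic_energy():
    px, py, pz = (random.gauss(0.0, 1.0) for _ in range(3))
    return 0.5 * (px * px + py * py + pz * pz)

mean_ke = sum(kinetic_energy() for _ in range(N)) / N
print(round(mean_ke, 2))  # close to 1.5 = (3/2) k_B*T
```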
Step (2) is the step from statistical mechanics to thermodynamics. Concerning observables, this is the step from the expectation value of a momentum distribution of a particle ensemble (\(L_s\)) to the temperature of the system as a whole (\(H_i\)). In many standard philosophical discussions this step is mischaracterized by the false claim that the thermodynamic temperature of a gas is identical with the mean kinetic energy of the molecules which constitute the gas. A proper discussion of the details was not available for a long time and has been achieved by Haag et al. (1974) and Takesaki (1970) in the framework of quantum field theory.
The main conceptual point in step (2) is that thermodynamic observables such as temperature presume thermodynamic equilibrium as a crucial assumption serving as a contextual condition. It is formulated in the zeroth law of thermodynamics and not available at the level of statistical mechanics. The very concept of temperature is thus foreign to statistical mechanics and pertains to the level of thermodynamics alone. (Needless to say, there are more thermodynamic observables in addition to temperature. Note that also a feature so fundamental as irreversibility in thermodynamics depends crucially on the context of thermal equilibrium.)
The context of thermal equilibrium (\(H_i\)) can be recast in terms of a class of distinguished statistical states (\(L_s\)), the so-called Kubo-Martin-Schwinger (KMS) states. These states are defined by the KMS condition which characterizes the (structural) stability of a KMS state against local perturbations. (More precisely, this includes stationarity, ergodicity, and mixing; compare Atmanspacher and beim Graben 2007). Hence, the KMS condition implements the zeroth law of thermodynamics as a stability criterion at the level of statistical mechanics. (The second law of thermodynamics expresses this stability in terms of a maximization of entropy for thermal equilibrium states. Equivalently, the free energy of the system is minimal in thermal equilibrium.)
Statistical KMS states induce a contextual topology in the state space of statistical mechanics (\(L_s\)) which is basically a coarse-grained version of the topology of \(L_i\ .\) This means nothing else than a partitioning of the state space into cells, leading to statistical states (\(L_s\)) that represent equivalence classes of individual states (\(L_i\)). They form ensembles of states that are indistinguishable with respect to their mean kinetic energy and can be assigned the same temperature (\(H_i\)). Differences between individual states at \(L_i\) falling into the same equivalence class at \(L_s\) are irrelevant with respect to a particular temperature at \(H_i\ .\)
While step (1) formulates statistical states from individual states at the mechanical level of description, step (2) provides individual thermal states from statistical mechanical states. Along with this step goes a definition of new, emergent thermal observables that are coextensive with, but not identical to, mechanical observables. All this is guided by and impossible without the explicit use of the context of thermal equilibrium.
The example of the relation between mechanics and thermodynamics is particularly valuable for the discussion of contextual emergence because it illustrates the two essential construction steps in great detail. There are other examples in physics and chemistry which can be discussed in terms of contextual emergence: emergence of geometric optics from electrodynamics (Primas 1998), emergence of electrical engineering concepts from electrodynamics (Primas 1998), emergence of chirality as a classical observable from quantum mechanics (Bishop 2005, Bishop and Atmanspacher 2006), emergence of diffusion and friction of a quantum particle in a thermal medium (de Roeck and Fröhlich 2011, Fröhlich et al. 2011), emergence of hydrodynamic properties from many-particle theory (Bishop 2008). More examples from the sciences can be found in the readable monograph by Chibbaro et al. (2014).
Applications in cognitive neuroscience
If descriptions at \(L\) and \(H\) are well established, as is the case in the preceding example, formally precise interlevel relations can be set up fairly straightforwardly. The situation becomes more challenging, though, when no such established descriptions are available, e.g. in cognitive neuroscience or consciousness studies, where relations between neural and mental descriptions are considered. Even there, contextual emergence has proven viable for the construction of emergent mental states (e.g., the identification of neural correlates of conscious states). That brain activity provides necessary but not sufficient conditions for mental states, which is a key feature of contextual emergence, is becoming increasingly clear even among practicing neuroscientists; see for instance the article by Frith (2011).
A basic element of theoretical and computational neuroscience is the set of Hodgkin-Huxley equations for the generation and propagation of action potentials (Hodgkin and Huxley 1952). The Hodgkin-Huxley equations form a system of four nonlinear ordinary differential equations: one electric conductance equation for transmembrane currents, and three master equations describing the opening kinetics of sodium and potassium ion channels. At a higher-level description of ion channel functioning, these equations characterize a deterministic dynamical system. However, at a lower-level description, the presence of master equations within the Hodgkin-Huxley system indicates a stochastic approach in terms of transition probabilities of Markov processes.
A closer inspection of the Hodgkin-Huxley equations (beim Graben 2016) reveals that the dynamics of neuronal action potentials is actually contextually emergent over (at least) three levels of description. At the first and lowest level, ion channels must be treated as macro-molecular quantum objects that are governed by a many-particle Schrödinger equation. This Schrödinger equation describes a highly entangled state of electrons and atomic nuclei as a whole, which does not allow an interpretation in terms of molecular structures such as an ion channel with a pore that is either closed or open. The molecular structure of an ion channel is contextually emergent through the Born-Oppenheimer approximation (cf. Primas 1998, Bishop and Atmanspacher 2006), separating electronic and nucleonic wave functions. After that separation, the electronic quantum dynamics becomes constrained to a (relatively) rigid nucleonic frame that now possesses a classical spatial structure.
At a second level, the fluctuations of the spatial structure of an ion channel must be treated as a stochastic process. Under the respective stability conditions for such processes (stationarity, ergodicity, mixing; compare Atmanspacher and beim Graben 2007) a continuous master equation for the molecular configurations can be derived (van Kampen 1992). Finally, at the third level, a contextual coarse-graining of configuration space into four closed and one open state (here for the potassium channel) yields the master equations of the Hodgkin-Huxley system as a contextually emergent description.
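For concreteness, the higher-level deterministic description can be integrated directly. The sketch below uses the standard squid-axon parameter set of Hodgkin and Huxley (1952) (units mV, ms, µA/cm²) with a simple Euler scheme; it is an illustration of the deterministic dynamical system mentioned above, not a reproduction of the original computations.

```python
from math import exp

# Euler integration of the standard Hodgkin-Huxley equations (squid
# axon parameter set; units mV, ms, uA/cm^2): one conductance equation
# for the membrane potential V and three gating equations (m, h, n).

C, g_Na, g_K, g_L = 1.0, 120.0, 36.0, 0.3
E_Na, E_K, E_L = 50.0, -77.0, -54.4

def a_m(V): return 0.1 * (V + 40.0) / (1.0 - exp(-(V + 40.0) / 10.0))
def b_m(V): return 4.0 * exp(-(V + 65.0) / 18.0)
def a_h(V): return 0.07 * exp(-(V + 65.0) / 20.0)
def b_h(V): return 1.0 / (1.0 + exp(-(V + 35.0) / 10.0))
def a_n(V): return 0.01 * (V + 55.0) / (1.0 - exp(-(V + 55.0) / 10.0))
def b_n(V): return 0.125 * exp(-(V + 65.0) / 80.0)

V = -65.0  # resting potential
m = a_m(V) / (a_m(V) + b_m(V))  # gating variables at steady state
h = a_h(V) / (a_h(V) + b_h(V))
n = a_n(V) / (a_n(V) + b_n(V))

I_ext, dt, v_max = 10.0, 0.01, V  # injected current triggers spiking
for _ in range(int(50.0 / dt)):   # 50 ms of simulated time
    I_ion = (g_Na * m**3 * h * (V - E_Na)
             + g_K * n**4 * (V - E_K)
             + g_L * (V - E_L))
    V += dt * (I_ext - I_ion) / C
    m += dt * (a_m(V) * (1.0 - m) - b_m(V) * m)
    h += dt * (a_h(V) * (1.0 - h) - b_h(V) * h)
    n += dt * (a_n(V) * (1.0 - n) - b_n(V) * n)
    v_max = max(v_max, V)

print(round(v_max, 1))  # peak membrane potential, well above 0 mV
```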
Macrostates in neural systems
Contextual emergence addresses both the construction of a partition at a lower-level description and the application of a higher-level context to do this in a way adapted to a specific higher-level description. Two alternative strategies have previously been proposed to construct \(H_i\)-states ("neural macrostates") from \(L_i\)-states ("neural microstates"): one by Amari and collaborators and another by Crutchfield and collaborators.
Amari and colleagues (Amari 1974, Amari et al. 1977) proposed to identify neural macrostates based on two criteria: (i) the structural stability of microstates as a necessary lower-level condition, and (ii) the decorrelation of microstates as a sufficient higher-level condition. The required macrostate criteria, however, do not exploit the dynamics of the system in the direct way which a Markov partition allows. A detailed discussion of contextual emergence in Amari's approach is due to beim Graben et al. (2009).
Mental states from neurodynamics
For the contextual emergence of mental states from neural states, the first desideratum is the specification of proper levels \(L\) and \(H\ .\) With respect to \(L\ ,\) one needs to specify whether states of neurons, of neural assemblies or of the brain as a whole are to be considered; and with respect to \(H\) a class of mental states reflecting the situation under study needs to be defined. In a purely theoretical approach, this can be tedious, but in empirical investigations the experimental setup can often be used for this purpose. For instance, experimental protocols include a task for subjects that defines possible mental states, and they include procedures to record brain states.
The following discussion will first address a general theoretical scenario (developed by Atmanspacher and beim Graben 2007) and then a concrete experimental example (worked out by Allefeld et al. 2009). Both are based on the so-called state space approach to mental and neural systems, see Fell (2004) for a brief introduction.
The first step is to find a proper assignment of \(L_i\) and \(L_s\) at the neural level. A good candidate for \(L_i\) are the properties of individual neurons. Then the first task is to construct \(L_s\) in such a way that statistical states are based on equivalence classes of those individual states whose differences are irrelevant with respect to a given mental state at level \(H\ .\) This reflects that a neural correlate of a conscious mental state can be multiply realized by "minimally sufficient neural subsystems correlated with states of consciousness" (Chalmers 2000).
In order to identify such a subsystem, we need to select a context at the level of mental states. As one among many possibilities, one may use the concept of "phenomenal families" (Chalmers 2000) for this purpose. A phenomenal family is a set of mutually exclusive phenomenal (mental) states that jointly partition the space of mental states. Starting with something like creature consciousness, that is, being conscious versus not being conscious, one can define increasingly refined levels of phenomenal states of background consciousness (awake, dreaming, sleep, anesthesia, ...), wake consciousness (perceptual, cognitive, affective, ...), perceptual consciousness (visual, auditory, tactile, ...), visual consciousness (color, form, location, ...), and so on.
Selecting one of these levels provides a context which can then be implemented as a stability criterion at \(L_s\ .\) In cases like the neural system, where complicated dynamics far from thermal equilibrium are involved, a powerful method to do so uses the neurodynamics itself to find proper statistical states. The essential point is to identify a partition of the neural state space whose cells are robust under the dynamics. This guarantees that individual mental states \(H_i\ ,\) defined on the basis of statistical neural states \(L_s\ ,\) remain well-defined as the system develops in time. The reason is that differences between individual neural states \(L_i\) belonging to the same statistical state \(L_s\) remain irrelevant as the system develops in time.
The construction of statistical neural states is strikingly analogous to what leads Butterfield (2012) to the notion of meshing dynamics. In his terminology, \(L\)-dynamics and \(H\)-dynamics mesh if coarse graining and time evolution commute. From the perspective of contextual emergence, meshing is guaranteed by the stability criterion induced by the higher-level context. In this picture, meshing translates into the topological equivalence of the two dynamics.
For multiple fixed points, their basins of attraction represent proper cells, while chaotic attractors need to be coarse-grained by so-called generating partitions. From experimental data, both can be numerically determined by partitions leading to Markov chains. These partitions yield a rigorous theoretical constraint for the proper definition of stable mental states. The formal tools for the mathematical procedure derive from the fields of ergodic theory (Cornfeld et al. 1982) and symbolic dynamics (Marcus and Lind 1995), and are discussed in some detail in Atmanspacher and beim Graben (2007) and Allefeld et al. (2009).
A pertinent example for the application of contextual emergence to experimental data is the relation between mental states and EEG dynamics. In a recent study, Allefeld et al. (2009) tested the method using data from the EEG of subjects with sporadic epileptic seizures. This means that the neural level is characterized by brain states recorded via EEG, while the context of normal and epileptic mental states essentially requires a bipartition of that neural state space.
The data analytic procedure rests on ideas by Gaveau and Schulman (2005), Froyland (2005), and Deuflhard and Weber (2005). It starts with a (for instance) 20-channel EEG recording, giving rise to a state space of dimension 20, which can be reduced to a lower number by restricting to principal components (PC). On the resulting low-dimensional state space, a homogeneous grid of cells is imposed in order to set up a Markov transition matrix T reflecting the EEG dynamics on a fine-grained auxiliary partition.
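A drastically simplified stand-in for this procedure can be sketched in code: a simulated noisy double-well process plays the role of the PC-reduced EEG signal, a two-cell grid plays the role of the auxiliary partition, and a Markov transition matrix is estimated by counting cell-to-cell transitions. All dynamical details (the drift, the noise level, the grid) are assumptions made for the illustration.

```python
import random

# Hedged sketch of the data-analytic step: a one-dimensional noisy
# double-well process replaces the PC-reduced EEG, a coarse grid of two
# cells replaces the fine-grained partition, and a Markov transition
# matrix T is estimated from the observed cell-to-cell transitions.

random.seed(1)
x, traj = 1.0, []
for _ in range(20_000):
    # overdamped double-well dynamics with additive noise (illustrative)
    x += 0.1 * (x - x**3) + random.gauss(0.0, 0.2)
    traj.append(0 if x < 0.0 else 1)  # cell index of the current state

counts = [[0, 0], [0, 0]]
for a, b in zip(traj, traj[1:]):
    counts[a][b] += 1

# Row-normalize the counts to obtain transition probabilities.
T = [[c / sum(row) if sum(row) else 0.0 for c in row] for row in counts]

# Metastability shows up as diagonal dominance: staying in a cell is
# far more probable than switching to the other cell.
print(round(T[0][0], 2), round(T[1][1], 2))
```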
The eigenvalues of T express relaxation time scales for the dynamics which can be ordered by size. Gaps between successive relaxation times indicate groupings referring to mental states defined by partitions of neural states of increasing refinement. The first group is often sufficient for the distinction of "target" mental states.
The eigenvectors corresponding to the eigenvalues of T span an eigenvector space, in which the measured PC-compactified states form a simplex. For instance, three leading eigenvalues allow a representation of neural states in a two-dimensional eigenvector space, which yields a 2-simplex with 3 vertices (a triangle). Classifying the measured neural states according to their distance from the vertices of the simplex then leads to three clusters of neural data. They can be coded and identified in the PC-state space (Allefeld and Bialonski 2007), where the clusters appear as non-intersecting convex sets distinguishing one normal state and one seizure state (composed of two substates). For details see Allefeld et al. (2009, Sec. V).
Finally, the result of the partitioning can be inspected in the originally recorded time series to check whether mental states are reliably assigned to the correct episodes in the EEG dynamics. The study by Allefeld et al. (2009) shows perfect agreement between the distinction of normal and epileptic states and the bipartition resulting from the spectral analysis of the neural transition matrix.
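The spectral step can be illustrated with an idealized 4-cell transition matrix containing two metastable groups. In the sketch below (a toy example with assumed numbers, not EEG data), the second eigenvalue 0.8 is separated by a gap from the remaining eigenvalues, and the signs of the corresponding eigenvector recover the bipartition into two macrostates.

```python
# Idealized 4-cell transition matrix with two metastable groups {0,1}
# and {2,3}. The eigenvalues are 1, 0.8, 0, 0: the gap below 0.8
# signals a two-state macro-description, and the signs of the second
# eigenvector recover the bipartition.

T = [[0.45, 0.45, 0.05, 0.05],
     [0.45, 0.45, 0.05, 0.05],
     [0.05, 0.05, 0.45, 0.45],
     [0.05, 0.05, 0.45, 0.45]]

n = len(T)

def matvec(A, v):
    return [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]

# Deflate the stationary eigenvector (constant, eigenvalue 1), then use
# power iteration to obtain the second eigenvector of T.
B = [[T[i][j] - 1.0 / n for j in range(n)] for i in range(n)]
v = [0.9, 0.4, -0.2, -1.1]  # arbitrary start vector
for _ in range(100):
    w = matvec(B, v)
    norm = max(abs(c) for c in w)
    v = [c / norm for c in w]

# Rayleigh quotient gives the second eigenvalue; eigenvector signs
# assign each cell to one of the two macrostates.
lam2 = sum(u * w for u, w in zip(v, matvec(B, v))) / sum(u * u for u in v)
groups = [0 if c > 0 else 1 for c in v]
print(round(lam2, 3), groups)  # 0.8 [0, 0, 1, 1]
```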
Another EEG-segmentation algorithm that utilizes the recurrence structure of multivariate time series has been suggested by beim Graben and Hutt (2013, 2015). Their recurrence structure analysis (RSA) partitions the state space into clusters of recurrent, and therefore overlapping, balls obtained from the recurrence plot of the dynamical system (Eckmann et al. 1987). Different choices of the radius r of the balls lead to potentially different segmentations of the time series from the corresponding partitions. An optimal choice of r, however, will ideally reflect the dwell times within metastable states and the transitions between them (beim Graben et al. 2016). This can be described by a Markov chain with one distinguished transient state and further states representing the metastable states of the dynamics.
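A minimal illustration of the recurrence-plot construction underlying RSA, with an assumed periodic toy signal in place of measured EEG:

```python
from math import sin, pi

# Minimal sketch of a recurrence plot (Eckmann et al. 1987):
# R[i][j] = 1 if states i and j of a trajectory lie within a ball of
# radius r. The trajectory here is an illustrative periodic signal
# with period 20, so recurrences appear at lags that are multiples
# of the period.

signal = [sin(2 * pi * t / 20) for t in range(60)]
r = 0.05
R = [[1 if abs(a - b) < r else 0 for b in signal] for a in signal]

# Recurrent after one full period, but not after a quarter period.
print(R[0][20], R[0][5])  # 1 0
```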
The deviation of a given contextual segmentation from the optimal segmentation can be assessed using a utility function whose maximization leads to a contextually emergent brain microstate segmentation of the EEG. Applying this technique to EEG data from anesthetized ferrets and to event-related brain potentials from human language-processing experiments revealed good correlation with mental states.
The intentional stance
According to Dennett (1989), the intentional stance can be applied to predict the behavior of any system that is too complex to be treated as either a physical or a designed system. Intentional systems in this sense are systems whose behavior is predictable by ascribing beliefs and desires to their internal states. Examples of intentional systems range from thermostats and chess computers via "magnetic snakes" (Snezhko et al. 2006) to "true believers", e.g. human beings.
In order to make meaningful predictions of a system, several necessary and sufficient conditions on the system's dynamics must be fulfilled (beim Graben 2014). First of all, the system's dynamics must be non-trivial, thus excluding most kinds of linear systems with periodic oscillations or damped relaxations. The class of putative intentional systems can be embedded into an "intentional hierarchy" ranging from the general case of nonlinear nonequilibrium dissipative systems to more specific intentional systems and "true believers" as a subclass.
Being a physical system is necessary for being a nonlinear dissipative nonequilibrium system; being a nonlinear dissipative nonequilibrium system is necessary for being an intentional system; and being an intentional system is necessary for being a true believer. Moreover, sufficient conditions within the intentional hierarchy implement contextual stability conditions.
The most general case corresponds to the transition from equilibrium thermodynamics to fluid dynamics: The phenomenal laws of fluid dynamics (the Navier-Stokes equations) emerge from statistical mechanics under the assumption of "local equilibrium". At the next level, several sufficient boundary conditions must be selected to give rise to processes of self-organization, nicely illustrated by means of "magnetic snakes". Then, a rationality constraint is imposed for optimal dissipation of pumped energy (Tschacher and Haken 2007). Finally, "true believers" are contextually emergent as intentional systems that are stable under mutual adoption of the intentional stance.
Symbol grounding
Another application of contextual emergence refers to the symbol grounding problem posed by Harnad (1990). The key issue of symbol grounding is the problem of assigning meaning to symbols on purely syntactic grounds, as proposed by cognitivists such as Fodor and Pylyshyn (1988). This entails the question of how conscious mental states can be characterized by their neural correlates; see Atmanspacher and beim Graben (2007). Viewed from a more general perspective, symbol grounding has to do with the relation between analog and digital systems, the way in which syntactic digital symbols are related to the analog behavior of a system they describe symbolically.
An instructive example for this distinction is given by dynamical automata (Tabor 2002, Carmantini et al. 2017). These are piecewise linear (globally nonlinear) time-discrete maps over a two-dimensional state space which assume their interpretation as symbolic computers through a rectangular partition of the unit square. Interestingly, a single point trajectory, i.e. the evolution of microstates (at a lower level description) is not fully interpretable as symbolic computation. Therefore, one has to consider (higher-level) macrostates, based on ensembles of state space points (or probability distributions of points) that evolve under the dynamics.
Beim Graben and Potthast (2012) showed that only uniform probability distributions with rectangular support exhibit a stable dynamics that is interpretable as computation. Thus, the huge space of possible probability distributions must be contextually restricted to the subclass of uniform probability distributions in order to obtain meaningfully grounded symbolic processes. In this sense, symbol grounding is contextually emergent.
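As a toy stand-in for such dynamical automata (the actual constructions of Tabor and of beim Graben and Potthast are more involved), the doubling map with the binary partition shows how an analog microstate trajectory acquires an exact symbolic interpretation, and why exact arithmetic is needed to retain it at the microstate level:

```python
from fractions import Fraction

# Toy stand-in for a dynamical automaton: the doubling map
# x -> 2x mod 1 with the binary partition {[0,1/2), [1/2,1)} acts as
# a shift on binary expansions, i.e. the analog dynamics implements an
# exact symbolic computation. Exact rational arithmetic is used here
# because floating-point microstates would quickly lose the symbolic
# interpretation (errors double at every step).

x = Fraction(5, 16)  # binary expansion 0.0101
symbols = []
for _ in range(5):
    symbols.append(0 if x < Fraction(1, 2) else 1)
    x = (2 * x) % 1

print(symbols)  # [0, 1, 0, 1, 0] -- the binary digits of 5/16
```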
Mental causation
It is a long-standing philosophical puzzle how the mind can be causally relevant in a physical world: the problem of mental causation (for reviews see Robb and Heil 2009 and Harbecke 2008, Ch. 1). The question of how mental phenomena can be causes is of high significance for an adequate comprehension of scientific disciplines such as psychology and cognitive neuroscience. Moreover, mental causation is crucial for our everyday understanding of what it means to be an agent in a natural and social environment. Without the causal efficacy of mental states the notion of agency would be nonsensical.
One of the reasons why the causal efficacy of the mental has appeared questionable is that a horizontal (intralevel, diachronic) determination of a mental state by prior mental states seems to be inconsistent with a vertical (interlevel, synchronic) determination of that mental state by neural states. In a series of influential papers and books, Kim has presented his much discussed supervenience argument (also known as exclusion argument), which ultimately amounts to the dilemma that mental states either are causally inefficacious or they hold the threat of overdetermining neural states. In other words: either mental events play no horizontally determining causal role at all, or they are causes of the neural bases of their relevant horizontal mental effects (Kim 2003).
The interlevel relation of contextual emergence yields a quite different perspective on mental causation. It dissolves the alleged conflict between horizontal and vertical determination of mental events as ill-conceived (Harbecke and Atmanspacher 2011). The key point is a construction of properly defined mental states from the dynamics of an underlying neural system. This can be done via statistical neural states based on a proper partition, such that these statistical neural states are coextensive (but not necessarily identical) with individual mental states.
This construction implies that the mental dynamics and the neural dynamics, related to each other by a so-called intertwiner, are topologically equivalent (Atmanspacher and beim Graben 2007). Given properly defined mental states, the neural dynamics gives rise to a mental dynamics that is independent of those neurodynamical details that are irrelevant for a proper construction of mental states.
As a consequence, (i) mental states can indeed be causally and horizontally related to other mental states, and (ii) they are neither causally related to their vertical neural determiners nor to the neural determiners of their horizontal effects. This makes a strong case against a conflict between a horizontal and a vertical determination of mental events and resolves the problem of mental causation in a deflationary manner. Vertical and horizontal determination do not compete but complement one another in a cooperative fashion. Both together deflate Kim's dilemma and reflate the causal efficacy of mental states. These conclusions match and refine the notion of proportionate causation introduced by Yablo (1992).
In this picture, mental causation is a horizontal relation between previous and subsequent mental states, although its efficacy is actually derived from a vertical relation: the downward confinement of (lower-level) neural states originating from (higher-level) mental constraints. This vertical relation is characterized by an intertwiner, a mathematical mapping, which must be distinguished from a causal before-after relation. For this reason, the terms downward causation or top-down causation (Ellis 2008) are infelicitous choices for addressing a downward confinement by contextual constraints.
Dual-aspect monism
Within the tradition of dual-aspect thinking, one can distinguish two different, in a sense opposing base conceptions. In one of them, psychophysically neutral elementary entities are composed to sets of such entities, and depending on the composition these sets acquire mental or physical properties. The other base conception refers to a psychophysically neutral domain which does not consist of elementary entities waiting to be composed, but is conceived as one overarching whole that is to be decomposed. In contrast to the atomistic picture of compositional dual-aspect monism, the holistic picture of the decompositional variant is strongly reminiscent of the fundamental insight of entanglement in quantum physics.
The contextual emergence of both the mental and the material from a psychophysically neutral whole requires a fresh look at the conceptual framework, both technically and in terms of the underlying metaphysics. At the technical level, we now refer to the contextual emergence of multiplicity from unity, fine grains from coarse grains, rather than the other way around (Atmanspacher 2017). The basic idea here is that a "primordial" decomposition of an undivided whole generates (under a particular context) different domains that give rise to differentiations, e.g. the mind-matter distinction.
In the decompositional variety of dual-aspect monism, refinement by symmetry breakdown is conceptually prior to its opposite of generalization, where the restoration of symmetries generates equivalence classes of increasing size. The basic undivided, psychophysically neutral reality is the trivial partition where nothing is distinguished. There is full symmetry and, hence, the corresponding reality is "ineffable", or "discursively inaccessible". Successive decompositions give rise to more and more refined partitions, where symmetries are broken and equivalence classes become smaller and smaller. Phenomenal families of mental states (Chalmers 2000) illustrate this for the mental domain.
At the metaphysical level, the mental and the physical remain epistemic, but the undivided whole is added as an ontic dimension. This reminds one of Plato's ideas or Kant's things-in-themselves, which are empirically inaccessible in principle and, in this sense, scientifically mute. Indeed, an undivided whole cannot be further characterized without introducing distinctions that break up the wholeness. Yet, it provides one asset in the metaphysics of the mind-matter problem that no other philosophical position provides: the emergence of mind-matter correlations as a direct and immediate consequence.
Philosophy of science
Stochastic and deterministic descriptions
Determinism is often understood as a feature of ontic descriptions of states and observables, whereas stochasticity refers to epistemic descriptions (Atmanspacher 2002). Mathematical models of classical point mechanics are the most common examples of deterministic descriptions, and three properties of these descriptions are particularly important (Bishop 2002): (1) differential dynamics, (2) unique evolution, and (3) value determinateness. (1) means essentially that the system's evolution obeys a differential equation (or some similar algorithm) in a space of ontic states. (2) says that for given initial and boundary conditions there is a unique trajectory. (3) assumes that any state can be described with arbitrarily small (non-zero) error.
These three points are not independent of each other but define a hierarchy for the contextual emergence of deterministic descriptions (Bishop and beim Graben 2016). Assuming (1) as a necessary condition for determinism, (2) can be proven under the sufficient condition that the trajectories created by a vector field obeying (1) pass through points whose distance is stable under small perturbations. Assuming (2) for almost every initial condition as a necessary condition of determinism defines a phase flow with weak causality. In order to prove (3), one needs strong causality as a sufficient condition.
For a weakly causal system violating (3), trajectories may exponentially diverge, as in chaotic systems. In this situation, dilation techniques (e.g., Gustafson 2002) can lead to contextually emergent stochasticity in two steps. In the first step, a coarse-graining yields a Markov process. In the second step, one requires that this process is mixing, such that it approaches an equilibrium distribution (Misra et al. 1979); the deterministic dynamics is then a Kolmogorov flow, thereby implementing microscopic chaos as a stability condition (Bishop and beim Graben 2016).
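The first of these two steps can be sketched numerically. The fragment below is our own minimal illustration, not code from the cited literature: it coarse-grains the logistic map, a standard stand-in for a chaotic, weakly causal system, with the binary partition at x = 1/2 and estimates the transition statistics of the induced symbolic process.

```python
import numpy as np

# Illustrative sketch (our choice of system, not an example from the article):
# coarse-graining a chaotic deterministic map into a two-state Markov chain.

rng = np.random.default_rng(1)
x = rng.random()
symbols = []
for _ in range(100000):
    x = 4.0 * x * (1.0 - x)              # logistic map, fully chaotic regime
    symbols.append(0 if x < 0.5 else 1)  # binary generating partition at 1/2

# Estimate the transition matrix of the induced symbolic process.
counts = np.zeros((2, 2))
for a, b in zip(symbols[:-1], symbols[1:]):
    counts[a, b] += 1
P = counts / counts.sum(axis=1, keepdims=True)
print(P)  # approximately [[0.5, 0.5], [0.5, 0.5]]: a mixing Markov chain
```

The resulting transition matrix is close to that of a fair Bernoulli process, whose equilibrium distribution is approached exponentially fast; this illustrates how a stochastic description emerges from the deterministic dynamics under a suitable coarse-graining.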
Interestingly, the converse is also possible. For a continuous stochastic process which fulfills the Markov criterion, the master equation approach leads to a deterministic "mean-field equation" (van Kampen 1992). Bishop and beim Graben (2016) showed that this situation is analogous to the paradigmatic example of the contextual emergence of thermal equilibrium states where thermal KMS macrostates are almost pure, and hence almost dispersion-free.
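This passage can be made concrete with a standard textbook case (van Kampen 1992); the specific one-step birth-death process is our illustrative choice, not an example from the article. With birth rate $g(n)$ and death rate $r(n)$, the master equation reads

```latex
\frac{dp_n}{dt} = g(n-1)\,p_{n-1} + r(n+1)\,p_{n+1} - \bigl[g(n) + r(n)\bigr]\,p_n .
```

Taking the mean $\langle n \rangle = \sum_n n\,p_n$ gives the exact relation $d\langle n\rangle/dt = \langle g(n)\rangle - \langle r(n)\rangle$, which closes to the deterministic mean-field equation $d\langle n\rangle/dt \approx g(\langle n\rangle) - r(\langle n\rangle)$ when the distribution is sharply peaked. The sharply peaked, almost dispersion-free distribution is precisely the stability condition for the emergent deterministic description.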
Reproducibility and relevance
Reproducibility is one of the pillars of scientific methodology, yet it becomes particularly difficult in interdisciplinary research where the results to be reproduced typically refer to more than one single level of description of the system considered. In such cases it is mandatory to distinguish the relevant attributes or observables of the system, depending on its description. Usually, different descriptive levels go along with different degrees of granularity. While lower-level descriptions address systems in terms of micro-properties (position, momentum, etc.), other, more global, macro-properties are more suitably taken into account for higher-level descriptions.
This observation led van Fraassen (1980) to the notion of explanatory relativity, where explanations are not only relationships between theories and facts; they are three-place relations between theories, facts, and contexts. The relevance of an explanation is determined by contexts that have to be selected, and are not themselves part of a scientific description.
Explanatory relativity backed up by relevance criteria can vitally serve the discussion of reproducibility across scientific disciplines. Features that are relevant for a proper explanation of some observation should have a high potential to be also relevant for the robust reproduction of that observation. But which properties of systems and their descriptions may be promising candidates for the application of such relevance criteria? One option to highlight relevance criteria is to consider the "granularity" (coarseness) of a description, which usually changes across disciplines.
The transformation between descriptive levels and their associated granularities is possible by the interlevel relation of contextual emergence (Atmanspacher et al. 2014). It yields a formally sound and empirically applicable procedure to construct level-specific criteria for relevant observables across disciplines. Relevance criteria merged with contextual emergence challenge the old idea of one fundamental ontology from which everything else derives. At the same time, the scheme of contextual emergence is specific enough to resist the backlash into a relativist patchwork of unconnected model fragments.
Contextual emergence was originally conceived as a relation between levels of descriptions, not levels of nature: It addresses questions of epistemology rather than ontology. In agreement with Esfeld (2009), who advocated that ontology needs to regain more significance in science, it would be desirable to know how ontological considerations might be added to the picture that contextual emergence provides.
A network of descriptive levels of varying degrees of granularity raises the question of whether descriptions with finer grains are more fundamental than those with coarser grains. The majority of scientists and philosophers of science in the past tended to answer this question affirmatively. As a consequence, there would be one fundamental ontology, preferentially that of elementary particle physics, to which the terms at all other descriptive levels can be reduced.
But this reductive credo also produced critical assessments and alternative proposals. A philosophical precursor of trends against a fundamental ontology is Quine's (1969) ontological relativity. Quine argued that if there is one ontology that fulfills a given descriptive theory, then there is more than one. It makes no sense to say what the objects of a theory are, beyond saying how to interpret or reinterpret that theory in another theory. Putnam (1981, 1987) later developed a related kind of ontological relativity, first called internal realism, later sometimes modified to pragmatic realism.
On the basis of these philosophical approaches, Atmanspacher and Kronz (1999) suggested how to apply Quine's ideas to concrete scientific descriptions, their relationships with one another, and with their referents. One and the same descriptive framework can be construed as either ontic or epistemic, depending on which other framework it is related to: bricks and tables will be regarded as ontic by an architect, but they will be considered highly epistemic from the perspective of a solid-state physicist.
Coupled with the implementation of relevance criteria due to contextual emergence (Atmanspacher 2016), the relativity of ontology must not be confused with dropping ontology altogether. The "tyranny of relativism" (as some have called it) can be avoided by identifying relevance criteria to distinguish proper context-specific descriptions from less proper ones. The resulting picture is more subtle and more flexible than an overly bold reductive fundamentalism, and yet it is more restrictive and specific than a patchwork of arbitrarily connected model fragments.
1. The combination of contextual emergence with supervenience can be seen as a program that comes conspicuously close to plain reduction. However, there is a subtle difference between the ways in which supervenience and emergence are in fact implemented. (In a related sense, Butterfield (2011a,b) has argued that emergence, supervenience and even reduction are not mutually incompatible.) While supervenience refers to states, the argument by emergence refers to observables. The important selection of a higher-level context leads to a stability criterion for states, but it is also crucial for the definition of the set of observables with which lower-level macrostates are to be associated.
2. An alternative to contextual emergence is the construction of macrostates within an approach called computational mechanics (Shalizi and Crutchfield 2001). A key notion in computational mechanics is the notion of a "causal state". Its definition is based on the equivalence class of histories of a process that are equivalent for predicting the future of the process. Since any prediction method induces a partition of the state space of the system, the choice of an appropriate partition is crucial. If the partition is too fine, too many (irrelevant) details of the process are taken into account; if the partition is too coarse, not enough (relevant) details are considered. As described in detail by Shalizi and Moore (2003), it is possible to determine partitions leading to causal states. This is achieved by minimizing their statistical complexity, the amount of information which the partition encodes about the past. Thus, the approach uses an information theoretical criterion rather than a stability criterion to construct a proper partition for macrostates. Causal states depend on the "subjectively" chosen initial partition but are then "objectively" fixed by the underlying dynamics. This has been expressed succinctly by Shalizi and Moore (2003): Nature has no preferred questions, but to any selected question it has a definite answer.
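The grouping of histories into equivalence classes can be sketched for a simple toy process. The fragment below is our own illustration, not code from Shalizi and Crutchfield: for the "golden mean" process, in which a 1 is never followed by a 1, length-2 histories are grouped by their estimated next-symbol distribution.

```python
import numpy as np
from collections import defaultdict

# Illustrative causal-state sketch (our toy example): generate the golden
# mean process, then estimate P(next symbol | length-2 history).
rng = np.random.default_rng(0)
seq, prev = [], 0
for _ in range(200000):
    s = 0 if prev == 1 else int(rng.random() < 0.5)  # a 1 forces the next 0
    seq.append(s)
    prev = s

stats = defaultdict(lambda: [0, 0])  # history -> [count next 0, count next 1]
for i in range(2, len(seq)):
    stats[tuple(seq[i-2:i])][seq[i]] += 1

for h, (n0, n1) in sorted(stats.items()):
    print(h, round(n1 / (n0 + n1), 2))
# Histories (0,0) and (1,0) predict a 1 with probability ~0.5 and merge into
# one causal state; history (0,1) predicts a 1 with probability 0 and forms
# the second causal state. The history (1,1) never occurs.
```

The initial partition into length-2 histories is the "subjective" choice; the merger of predictively equivalent histories into two causal states is then fixed by the dynamics of the process, in line with the quoted dictum of Shalizi and Moore.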
3. Statistical neural states are multiply realized by individual neural states, and they are coextensive with individual mental states; see also Bechtel and Mundale (1999), who proposed precisely the same idea. There are a number of reasons, beyond the scope of this article, to distinguish this coextensivity from an identity relation. For details see Harbecke and Atmanspacher (2011).
4. The reference to phenomenal families à la Chalmers must not be misunderstood to mean that contextual emergence provides an option to derive the appearance of phenomenal experience from brain behavior. The approach addresses the emergence of mental states still in the sense of a third-person perspective. "What it is like to be" in a particular mental state, i.e. its qualia character, is not addressed at all.
5. Besides the application of contextual emergence under well-controlled experimental conditions, it may also be useful for investigating spontaneous behavior. If such behavior together with its neural correlates is continuously monitored and recorded, it is possible to construct proper partitions of the neural state space. Mapping the time intervals of these partitions to epochs of corresponding behavior may facilitate the characterization of typical paradigmatic behavioral patterns.
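The idea of segmenting a recorded trajectory into epochs can be sketched as follows. This is a deliberately simplified, hypothetical stand-in, not the recurrence-based algorithm of beim Graben and Hutt (2013): each state is assigned the symbol of the first earlier "anchor" state lying within distance eps, and states far from all anchors open a new domain.

```python
import numpy as np

# Hypothetical sketch: greedy partitioning of a trajectory into recurrence
# domains, so that epochs of similar states share a symbol.
def symbolize(traj, eps):
    anchors, symbols = [], []
    for x in traj:
        for k, a in enumerate(anchors):
            if np.linalg.norm(x - a) < eps:
                symbols.append(k)   # state recurs near an existing anchor
                break
        else:
            anchors.append(x)       # state opens a new domain
            symbols.append(len(anchors) - 1)
    return symbols

# Toy recording that alternates between two metastable epochs.
t = np.linspace(0, 4 * np.pi, 400)
traj = np.stack([np.where(np.sin(t) > 0, 1.0, -1.0), 0.0 * t], axis=1)
syms = symbolize(traj, eps=0.5)
print(sorted(set(syms)))  # two recurrence domains
```

The resulting symbol sequence marks the time intervals of the partition cells, which could then be mapped to epochs of corresponding behavior as described above.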
6. It is an interesting consequence of contextual emergence that higher-level descriptions constructed on the basis of proper lower-level partitions are compatible with one another. Conversely, improper partitions yield, in general, incompatible descriptions (beim Graben and Atmanspacher 2006). As ad-hoc partitions usually will not be proper partitions, corresponding higher-level descriptions will generally be incompatible. This argument was proposed (Atmanspacher and beim Graben 2007) for an informed discussion of how to pursue "unity in a fragmented psychology", as Yanchar and Slife (1997) put it.
7. For additional directions of research that utilize ideas pertaining to contextual emergence in cognitive science and psychology see Tabor (2002), Dale and Spivey (2005) and Jordan and Ghin (2006). They are similar in spirit, but differ in their scope and details. Applications of synergetics to cognitive science (Haken 2004) and brain science (Haken 2008) offer additional interesting parallels. The concept of closed-loop neuroscience (El Hady 2016) also utilizes the combination of bottom-up and top-down arguments in the study of multilevel systems.
C. Allefeld, H. Atmanspacher, J. Wackermann (2009): Mental states as macrostates emerging from EEG dynamics. Chaos 19, 015102.
C. Allefeld, S. Bialonski (2007): Detecting synchronization clusters in multivariate time series via coarse-graining of Markov chains. Physical Review E 76, 066207.
S.-I. Amari (1974): A method of statistical neurodynamics. Kybernetik 14, 201–215.
S.-I. Amari, K. Yoshida, K.-I. Kanatani (1977): A mathematical foundation for statistical neurodynamics. SIAM Journal of Applied Mathematics 33(1), 95–126.
H. Atmanspacher (2002): Determinism is ontic, determinability is epistemic. In Between Chance and Choice, ed. by H. Atmanspacher and R. Bishop, Imprint Academic, Exeter, pp. 49-74.
H. Atmanspacher (2007): Contextual emergence from physics to cognitive neuroscience. Journal of Consciousness Studies 14(1/2), 18–36.
H. Atmanspacher (2016): Relevance criteria for reproducibility: The contextual emergence of granularity. In Reproducibility - Principles, Problems, Practices, Prospects, ed. by H. Atmanspacher and S. Maasen, Wiley, New York, pp. 525-538.
H. Atmanspacher (2017): Contextual emergence in decompositional dual-aspect monism. Mind and Matter 15, 111-129.
H. Atmanspacher, L. Bezzola, G. Folkers, and P.A. Schubiger (2014): Relevance relations for the concept of reproducibility. Journal of the Royal Society Interface 11(94), 20131030.
H. Atmanspacher and R.C. Bishop (2007): Stability conditions in contextual emergence. Chaos and Complexity Letters 2, 139–150.
H. Atmanspacher and P. beim Graben (2007): Contextual emergence of mental states from neurodynamics. Chaos and Complexity Letters 2, 151–168.
H. Atmanspacher and F. Kronz (1999): Relative onticity. In Quanta, Mind and Matter, ed. by H. Atmanspacher, A. Amann, and U. Müller-Herold, Kluwer, Dordrecht, pp.273-294.
W. Bechtel and J. Mundale (1999): Multiple realizability revisited: Linking cognitive and neural states. Philosophy of Science 66, 175–207.
A. Beckermann, H. Flohr, J. Kim (1992): Emergence or Reduction? de Gruyter, Berlin.
R.C. Bishop (2002): Deterministic and indeterministic descriptions. In Between Chance and Choice, ed. by H. Atmanspacher and R. Bishop, Imprint Academic, Exeter, pp. 5-31.
R.C. Bishop (2005): Patching physics and chemistry together. Philosophy of Science 72, 710–722.
R.C. Bishop (2008): Downward causation in fluid convection. Synthese 160, 229–248.
R.C. Bishop and H. Atmanspacher (2006): Contextual emergence in the description of properties. Foundations of Physics 36, 1753–1777.
R.C. Bishop and P. beim Graben (2016): Contextual emergence of deterministic and stochastic descriptions. In From Chemistry to Consciousness. The Legacy of Hans Primas, ed. by H. Atmanspacher and U. Müller-Herold, Springer, Berlin, pp. 95-110.
J. Butterfield (2012): Laws, causation and dynamics at different levels. Interface Focus 2, 101–114.
J. Butterfield (2011a): Emergence, reduction and supervenience: A varied landscape. Foundations of Physics 41, 920–960.
J. Butterfield (2011b): Less is different: emergence and reduction reconciled. Foundations of Physics 41, 1065–1135.
G.S. Carmantini, P. beim Graben, M. Desroches, S. Rodrigues (2017): A modular architecture for transparent computation in recurrent neural networks. Neural Networks 85, 85-105.
D. Chalmers (2000): What is a neural correlate of consciousness? In Neural Correlates of Consciousness, ed. by T. Metzinger, MIT Press, Cambridge, pp.17–39.
S. Chibbaro, L. Rondoni, A. Vulpiani (2014): Reductionism, Emergence and Levels of Reality, Springer, Berlin.
I.P. Cornfeld, S.V. Fomin, Ya.G. Sinai (1982): Ergodic Theory, Springer, Berlin, pp.250–252, 280–284.
R. Dale (2008): The possibility of a pluralist cognitive science. Journal of Experimental and Theoretical Artificial Intelligence 20, 155–179.
R. Dale, M. Spivey (2005): From apples and oranges to symbolic dynamics: A framework for conciliating notions of cognitive representations. Journal of Experimental and Theoretical Artificial Intelligence 17, 317–342.
D.C. Dennett (1989): The Intentional Stance, MIT Press, Cambridge.
W. de Roeck, J. Fröhlich (2011): Diffusion of a massive quantum particle coupled to a quasi-free thermal medium. Communications in Mathematical Physics 293, 361–398.
P. Deuflhard, M. Weber (2005): Robust Perron cluster analysis in conformation dynamics. Linear Algebra and its Applications 398, 161–184.
J.-P. Eckmann, S.O. Kamphorst, D. Ruelle (1987): Recurrence plots of dynamical systems. Europhysics Letters 4(9), 973-977.
G.F.R. Ellis (2008): On the nature of causation in complex systems. Transactions of the Royal Society of South Africa 63, 69–84.
M. Esfeld (2009): The rehabilitation of a metaphysics of nature. In The Significance of the Hypothetical in the Natural Sciences, ed. by M. Heidelberger and G. Schiemann, de Gruyter, Berlin, in press.
J. Fell (2004): Identifying neural correlates of consciousness: The state space approach. Consciousness and Cognition 13, 709–729.
J. Fodor and Z.W. Pylyshyn (1988): Connectionism and cognitive architecture: A critical analysis. Cognition 28, 3–71.
C.D. Frith (2011): What brain plasticity reveals about the nature of consciousness: Commentary. Frontiers in Psychology 2, doi: 10.3389/fpsyg.2011.00087.
J. Fröhlich, Z. Gang, A. Soffer (2011): Some Hamiltonian models of friction. Journal of Mathematical Physics 52, 83508/1–13.
G. Froyland (2005): Statistically optimal almost-invariant sets. Physica D 200, 205–219.
B. Gaveau, L.S. Schulman (2005): Dynamical distance: coarse grains, pattern recognition, and network analysis. Bulletin de Sciences Mathematiques 129, 631–642.
C. Gillett (2002): The varieties of emergence: Their purposes, obligations and importance. Grazer Philosophische Studien 65, 95–121.
P. beim Graben (2014): Contextual emergence of intentionality. Journal of Consciousness Studies 21(5-6), 75-96.
P. beim Graben (2016): Contextual emergence in neuroscience. In Closed Loop Neuroscience, ed. by A. El Hady, Elsevier, Amsterdam, pp. 171-184.
P. beim Graben, H. Atmanspacher (2006): Complementarity in classical dynamical systems. Foundations of Physics 36, 291–306.
P. beim Graben, A. Barrett, H. Atmanspacher (2009): Stability criteria for the contextual emergence of macrostates in neural networks. Network: Computation in Neural Systems 20, 177-195.
P. beim Graben, A. Hutt (2013): Detecting recurrence domains of dynamical systems by symbolic dynamics. Physical Review Letters, 110(15), 154101.
P. beim Graben, A. Hutt (2015): Detecting event-related recurrences by symbolic analysis: Applications to human language processing. Philosophical Transactions of the Royal Society London A373, 20140089.
P. beim Graben, R. Potthast (2012): Implementing Turing machines in dynamic field architectures. In Proceedings of AISB12 World Congress 2012 - Alan Turing 2012, ed. by M. Bishop, Y.-J. Erden, pp. 36-40.
P. beim Graben, K.K. Sellers, F. Fröhlich, A. Hutt (2016): Optimal estimation of recurrence structures from time series. Europhysics Letters 114, 38003.
K. Gustafson (2002): Time-space dilations and stochastic-deterministic dynamics. In Between Chance and Choice, ed. by H. Atmanspacher and R. Bishop, Imprint Academic, Exeter, pp. 115-148.
R. Haag, D. Kastler, E.B. Trych-Pohlmeyer (1974): Stability and equilibrium states. Communications in Mathematical Physics 38, 173–193.
A. El Hady, ed. (2016): Closed Loop Neuroscience, Elsevier, Amsterdam.
H. Haken (2004): Synergetic Computers and Cognition, Springer, Berlin.
H. Haken (2008): Brain Dynamics, Springer, Berlin.
J. Harbecke (2008): Mental Causation. Investigating the Mind's Powers in a Natural World, Ontos, Frankfurt.
J. Harbecke and H. Atmanspacher (2011): Horizontal and vertical determination of mental and neural states. Journal of Theoretical and Philosophical Psychology, in press.
S. Harnad (1990): The symbol grounding problem. Physica D 42, 335–346.
A.L. Hodgkin, A.F. Huxley (1952): A quantitative description of membrane current and its application to conduction and excitation in nerve. Journal of Physiology 117, 500-544.
J.S. Jordan and M. Ghin (2006): (Proto-) consciousness as a contextually emergent property of self-sustaining systems. Mind and Matter 4(1), 45–68.
J. Kim (1992): Multiple realization and the metaphysics of reduction. Philosophy and Phenomenological Research 52, 1–26.
J. Kim (1993): Supervenience and Mind, Cambridge University Press, Cambridge.
J. Kim (2003): Blocking causal drainage and other maintenance chores with mental causation. Philosophy and Phenomenological Research 67, 151–176.
D. Lind, B. Marcus (1995): Symbolic Dynamics and Coding, Cambridge University Press, Cambridge.
B. Misra, I. Prigogine, M. Courbage (1979): From deterministic dynamics to probabilistic descriptions. Proceedings of the National Academy of Sciences of the USA 76(8), 3607-3611.
H. Primas (1998): Emergence in the exact sciences. Acta Polytechnica Scandinavica 91, 83–98.
H. Putnam (1981): Reason, Truth and History, Cambridge University Press, Cambridge.
H. Putnam (1987): The Many Faces of Realism, Open Court, La Salle.
W.V.O. Quine (1969): Ontological relativity. In Ontological Relativity and Other Essays, Columbia University Press, New York, pp. 26-68.
D. Robb and J. Heil (2009): Mental causation. In The Stanford Encyclopedia of Philosophy, ed. by E.N. Zalta, Summer 2009 edition.
C.R. Shalizi, J.P. Crutchfield (2001): Computational mechanics: Pattern and prediction, structure and simplicity. Journal of Statistical Physics 104, 817–879.
C.R. Shalizi, C. Moore (2003): What is a macrostate? Subjective observations and objective dynamics. Preprint available at LANL cond-mat/0303625.
A. Snezhko, I.S. Aranson, W.-K. Kwok (2006): Surface wave assisted self-assembly of multidomain magnetic structures. Physical Review Letters 96, 078701.
A. Stephan (1999): Emergenz. Von der Unvorhersagbarkeit zur Selbstorganisation, Dresden University Press, Dresden.
W. Tabor (2002): The value of symbolic computation. Ecological Psychology 14, 21–51.
M. Takesaki (1970): Disjointness of the KMS states of different temperatures. Communications in Mathematical Physics 17, 33–41.
W. Tschacher, H. Haken (2007): Intentionality in non-equilibrium systems? The functional aspects of self-organized pattern formation. New Ideas in Psychology 25, 1-15.
B. van Fraassen (1980): The Scientific Image, Clarendon, Oxford.
N.G. van Kampen (1992): Stochastic Processes in Physics and Chemistry, Elsevier, Amsterdam.
S. Yablo (1992): Mental causation. Philosophical Review 101, 245–280.
S.C. Yanchar, B.D. Slife (1997): Pursuing unity in a fragmented psychology: Problems and prospects. Review of General Psychology 1, 235–255.