Integrated information theory

From Wikipedia, the free encyclopedia
Not to be confused with Information integration theory.

Integrated information theory (IIT) is a theoretical framework proposed to explain the nature of consciousness. It was developed by the psychiatrist and neuroscientist Giulio Tononi of the University of Wisconsin–Madison.[1] Tononi's initial ideas were further developed by Adam Barrett, who created related measures of integrated information,[2] such as "phi empirical".

Overview[edit]

Schematic diagram of how to decompose systems into overlapping complexes according to Tononi's information integration theory

The integrated information theory (IIT) of consciousness attempts to explain consciousness, or conscious experience, at a fundamental level using a principled theoretical framework. The theory starts from two key postulates about the nature of consciousness: that an experience carries information, and that an experience is integrated to the extent that its parts are informative about each other.[1]

Here, IIT embraces the information-theoretic sense of information: information is the reduction in uncertainty about the state of a variable, and the more possible states a variable has, the more information is gained by specifying its actual state. Applied to conscious experience as we know it, since the number of different possible experiences a human consciousness can generate is very large, the amount of information this conscious system must hold should also be large. The list of a system's possible states is called its "repertoire" in IIT.
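This sense of information can be made concrete with a few lines of Python: the Shannon entropy of a repertoire (a probability distribution over a system's possible states) measures, in bits, how much uncertainty is removed by learning the actual state. A minimal sketch (the function name is illustrative, not IIT terminology):

```python
import math

def entropy_bits(repertoire):
    """Shannon entropy, in bits, of a probability distribution over states."""
    return -sum(p * math.log2(p) for p in repertoire if p > 0)

# A repertoire of four equally likely states holds 2 bits of uncertainty:
print(entropy_bits([0.25, 0.25, 0.25, 0.25]))  # 2.0

# Learning that only two states remain possible leaves 1 bit of uncertainty,
# i.e. 1 bit of information has been gained:
print(entropy_bits([0.0, 0.0, 0.5, 0.5]))      # 1.0
```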

In a system composed of connected "mechanisms" (nodes containing information and causally influencing other nodes), the information among them is said to be integrated if, and to the extent that, the repertoire of the whole system carries more information about its previous state than the sum of all the mechanisms' repertoires considered individually. Consequently, integrated information does not increase by simply adding more mechanisms to a system if the mechanisms are independent of one another. Applied to consciousness, parts of an experience (qualia) such as color and shape are not experienced separately because they are integrated, unified in a single, whole experience; conversely, our digestive system is not considered part of our consciousness because the information generated in the body is not intrinsically integrated with that of the brain.
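As a toy illustration of this point (a simplification, not the full IIT calculus), one can compare the joint repertoire of two binary mechanisms against the product of their individual repertoires: when the mechanisms are independent the two coincide and the divergence between them is zero, while coupled mechanisms generate information jointly that their parts do not.

```python
import math

def kl_bits(p, q):
    """Kullback-Leibler divergence D(p || q) in bits."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def marginals(joint):
    """Marginals of two binary elements from a joint over states (00, 01, 10, 11)."""
    p00, p01, p10, p11 = joint
    first = [p00 + p01, p10 + p11]    # P(first element = 0), P(first element = 1)
    second = [p00 + p10, p01 + p11]   # P(second element = 0), P(second element = 1)
    return first, second

def product_dist(a, b):
    """Joint distribution the two elements would have if they were independent."""
    return [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]

# Independent mechanisms: the whole equals the product of its parts, so the
# divergence between them is zero.
independent = [0.25, 0.25, 0.25, 0.25]
a, b = marginals(independent)
print(kl_bits(independent, product_dist(a, b)))  # 0.0

# Perfectly correlated mechanisms: the whole says 1 bit more than its parts.
correlated = [0.5, 0.0, 0.0, 0.5]
a, b = marginals(correlated)
print(kl_bits(correlated, product_dist(a, b)))   # 1.0
```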

In IIT 3.0, the 2014 revision of the theory, five axioms were established to underpin it:[3]

  • Consciousness exists
  • Consciousness is compositional (structured)
  • Consciousness is informative
  • Consciousness is integrated
  • Consciousness is exclusive

See the original article for more information.

The suggestion is that the quantity of consciousness in a system is measured by the amount of integrated information it generates.

Qualia[edit]

The theory further proposes a way to characterize the quality of an experience, the qualia themselves, using a multidimensional space called qualia space (Q), wherein the ways mechanisms connect together form a shape within the space; the shape itself describes the qualia that are experienced.[4] (In the later IIT 3.0, qualia space, no longer referred to as Q, also takes into account the temporal aspects of a state.)

Q has an axis for every possible informational state of the system; any point in this space has a component for each state. The components can be viewed as probabilities that the system is in that state, so a point in Q represents a probability distribution, a repertoire, over what the system's state may be. If all states are equally probable (maximum entropy, with all mechanisms independent), this is represented by a point with the same value on every axis, its components summing to 1.

The quale of a properly connected system can be built up in Q by plotting the repertoires of the system as it is incrementally connected in all combinations. The line joining the base maximum-entropy repertoire to the repertoire of a system with a single connection forms a "q-arrow". Multiple connections correspond to concatenations of q-arrows called "q-edges". The ultimate point in Q corresponds to the actual, fully connected system, and is where all the q-arrows lead. Finally, the multidimensional shape delimited by the q-edges is the quale produced by the system.

Calculating integrated information[edit]

Relative entropy/effective information[edit]

ei(X(mech,x_1)) = H[p(X_0(mech,x_1)) \parallel p(X_0(maxH))]

where X is our system, mech is that system's mechanism, x_1 is a state of the system, and p(X_0(maxH)) is the uniform or potential distribution.

Effective information is defined as the relative entropy H between the actual and potential repertoires, that is, the Kullback–Leibler divergence between them. It is implicitly specified by a mechanism and a state, and so is an 'intrinsic' property of the system. The actual repertoire of states is calculated by perturbing the system in all possible ways to obtain the forward repertoire of output states, and then applying Bayes' rule to infer the distribution over previous states.

Example[edit]

Consider a system of two binary elements. It has four possible states (00, 01, 10, 11).

The first binary element operates randomly. The second binary element will be whatever the first element was in the previous state. Initially: (0, 0).

  • The maximum-entropy (potential) repertoire is uniform: p = (1/4, 1/4, 1/4, 1/4).
  • Suppose the observed state at time t is 11.
  • Since the second element copies the first, the previous state must have been 10 or 11; hence the actual repertoire is p = (0, 0, 1/2, 1/2).
  • Applying the above equation, the effective information generated by the system is 1 bit.
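The calculation above can be checked numerically; a minimal sketch in Python (function name illustrative):

```python
import math

def effective_information(actual, potential):
    """Relative entropy H[actual || potential] (Kullback-Leibler divergence), in bits."""
    return sum(p * math.log2(p / q) for p, q in zip(actual, potential) if p > 0)

# States ordered (00, 01, 10, 11).
potential = [0.25, 0.25, 0.25, 0.25]  # maximum-entropy (uniform) repertoire
actual = [0.0, 0.0, 0.5, 0.5]         # previous state was 10 or 11, given 11 now

print(effective_information(actual, potential))  # 1.0
```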

Integration (Φ)[edit]

The measure of integrated information is denoted by the letter Phi (Φ).

\Phi (X(mech,x_1)) = H[p(X_0(mech,x_1)) \parallel \Pi p( ^k M_0(mech,\mu_1))] for ^k M_0 \in MIP

where X is our system, mech is that system's mechanism, x_1 is a state of the system, and \Pi p(^k M_0(mech,\mu_1)) is the product of the probability distributions of each part of the system in the minimal information partition (MIP).

Φ will be high when there is a lot of information generated among the parts of a system as opposed to within them.
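The minimization over partitions can be sketched in a few lines of Python. The following toy version is an assumption-laden simplification: it partitions a single observed repertoire into its marginals rather than perturbing each part independently, as the full theory requires. It searches all bipartitions for the one across which the least information is lost:

```python
import math
from itertools import combinations

def kl_bits(p, q):
    """Kullback-Leibler divergence D(p || q) in bits."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def marginal(dist, n_elems, part):
    """Marginalize a distribution over n binary elements onto the subset `part`."""
    out = [0.0] * (2 ** len(part))
    for state, p in enumerate(dist):
        bits = [(state >> (n_elems - 1 - i)) & 1 for i in range(n_elems)]
        sub = 0
        for i in part:
            sub = (sub << 1) | bits[i]
        out[sub] += p
    return out

def product_over_parts(dist, n_elems, parts):
    """Joint distribution the parts would produce if they were independent."""
    margs = [marginal(dist, n_elems, part) for part in parts]
    out = []
    for state in range(2 ** n_elems):
        bits = [(state >> (n_elems - 1 - i)) & 1 for i in range(n_elems)]
        p = 1.0
        for part, m in zip(parts, margs):
            sub = 0
            for i in part:
                sub = (sub << 1) | bits[i]
            p *= m[sub]
        out.append(p)
    return out

def phi_sketch(dist, n_elems):
    """Min over bipartitions of KL(whole || product of parts): a toy Phi."""
    elems = list(range(n_elems))
    best = float("inf")
    for k in range(1, n_elems // 2 + 1):
        for part_a in combinations(elems, k):
            part_b = tuple(i for i in elems if i not in part_a)
            q = product_over_parts(dist, n_elems, [part_a, part_b])
            best = min(best, kl_bits(dist, q))
    return best

# Two perfectly correlated elements integrate 1 bit across any cut:
print(phi_sketch([0.5, 0.0, 0.0, 0.5], 2))  # 1.0

# Adding an independent third element drives the minimum to zero: the
# minimum-information partition simply cuts the loner off, and across that
# cut nothing is lost (states ordered 000, 001, ..., 111).
print(phi_sketch([0.25, 0.25, 0.0, 0.0, 0.0, 0.0, 0.25, 0.25], 3))  # 0.0
```

This illustrates why independent mechanisms contribute no integrated information: the quantity is determined by the weakest cut, not the strongest one.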

Qualia[edit]

Using relative entropy, the amount of information generated by a single connection c within the system is quantified by the equation:

\Phi_c = H[p(X(mech,x)) \parallel p(Y(mech,y))]

where Y is the system with connection c removed.

Thus there are points X and Y in qualia space that correspond to the probability distributions of the system with and without the connection c, respectively. The vector drawn from Y to X has length \Phi_c, is associated with the connection c, and is called a q-arrow.
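For the two-element example system above, removing its single connection collapses the actual repertoire back to maximum entropy, so the length of the resulting q-arrow can be computed directly (a sketch; the repertoires are taken from the worked example):

```python
import math

def kl_bits(p, q):
    """Relative entropy H[p || q] in bits."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Repertoire of the two-element example system with its copy connection intact:
# given current state 11, the previous state was 10 or 11.
x_repertoire = [0.0, 0.0, 0.5, 0.5]

# With the connection c removed, state 11 no longer constrains the previous
# state, so the repertoire falls back to maximum entropy.
y_repertoire = [0.25, 0.25, 0.25, 0.25]

phi_c = kl_bits(x_repertoire, y_repertoire)
print(phi_c)  # 1.0 -- the length of the q-arrow associated with c
```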

References[edit]

  1. ^ a b Tononi, Giulio (December 2008). "Consciousness as integrated information: a provisional manifesto". The Biological Bulletin 215 (3): 216–242. doi:10.2307/25470707. ISSN 0006-3185. PMID 19098144. 
  2. ^ Barrett, Adam B.; Seth, Anil K. (2011). "Practical measures of integrated information for time-series data". PLoS Computational Biology 7 (1): e1001052. doi:10.1371/journal.pcbi.1001052.
  3. ^ Oizumi, Masafumi; Albantakis, Larissa; Tononi, Giulio (8 May 2014). "From the Phenomenology to the Mechanisms of Consciousness: Integrated Information Theory 3.0". PLoS Computational Biology 10 (5): e1003588. doi:10.1371/journal.pcbi.1003588. Retrieved 15 August 2014. 
  4. ^ Balduzzi, David; Tononi, Giulio (14 August 2009). "Qualia: The Geometry of Integrated Information". PLoS Computational Biology 5 (8): e1000462. doi:10.1371/journal.pcbi.1000462. Retrieved 15 August 2014. 
