# Integrated information theory

Integrated information theory (IIT) is a proposed theoretical framework intended to understand and explain the nature of consciousness. It was developed by psychiatrist and neuroscientist Giulio Tononi of the University of Wisconsin–Madison.[1]

## Overview

*Figure: Schematic diagram of how to decompose systems into overlapping complexes according to Tononi's information integration theory.*

The theory is based on two key propositions. The first is that every conscious state or moment contains a massive amount of information. A common example is a frame from a movie: upon seeing a single frame of a movie you have watched, you instantly associate it with a "specific conscious percept."[2] That is to say, you can discriminate a given frame in a film from any other frame, including a blank, black screen. The mind can therefore discriminate among a massive number of possible visual states, which represents a tremendous amount of information. Compare this visual awareness to a simple photodiode, which can only discriminate the presence of light from dark. Whether the light comes from a lightbulb, a scene from *Ben-Hur*, or the bright noon sun of a summer day, the photodiode represents only minimal information.

The second key proposition is that all of the information you glean from conscious states is highly, and innately, integrated in your mind. It is impossible for you to see the world apart from all of the information that you are conscious of. When you are looking at an orange, for example, you cannot separate the color of the fruit (orange) from its shape (round). Even though color processing and spatial processing are localized separately in the brain (a stroke victim can lose color perception yet retain perfect spatial awareness, for example), conscious experiences cannot be atomized into distinct parts.

The suggestion is that the amount of consciousness an entity has is equal to the amount of integrated information it can process.

From this premise, an equation is developed to define and quantify the amount of integrated information present, and the suggestion is that this quantity directly defines the level of consciousness of the information processor.

Giulio Tononi's initial ideas were further developed by Adam Barrett, who created similar measures of integrated information[3] such as "phi empirical".

## A definition of consciousness

IIT proposes that consciousness arises as a property of a physical system: its 'integrated information'. Integrated information is defined in such a way that it can be measured and quantified mathematically. This definition deviates markedly from standard definitions of consciousness.

## Calculating integrated information

### Information

Given a system (including its current probability distribution) and a mechanism (which specifies the probability distribution over possible next states when the current state is perturbed with all possible inputs), one can determine the actual distribution over the possible system states at the previous time step (t = −1). Thus, system and mechanism together constitute information about the system's previous state, in the classic sense of 'reduction of uncertainty.'

### Relative entropy/effective information

Effective Information is defined as the relative entropy H between the actual and potential repertoires, the Kullback–Leibler divergence.

It is implicitly specified by mechanism and state, so it is an 'intrinsic' property of the system.

The actual repertoire of states is calculated by perturbing the system in all possible ways to obtain the forward repertoire of output states, and then applying Bayes' rule to infer the distribution over previous states.

#### Example

Consider a system of two binary elements, with four possible states (00, 01, 10, 11). The first element operates randomly; the second element copies whatever the first element was in the previous state. Initially the system is in state (0, 0), and the potential repertoire is the maximum-entropy distribution p = (1/4, 1/4, 1/4, 1/4).

Given that the state at time t is 11, the previous state must have been 10 or 11, so the actual repertoire is p = (0, 0, 1/2, 1/2). The system has therefore generated one bit of information, since

$ei(X(mech,x_1)) = H[p(X_0(mech,x_1)) \parallel p(X_0(maxH))] = 1 \text{ bit}$

where X is our system, mech is that system's mechanism, $x_1$ is a state of the system, and $p(X_0(maxH))$ is the uniform or potential distribution.
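The worked example can be sketched in Python. This is a minimal illustration, not IIT's formal machinery: the function and variable names here are my own, and the mechanism is hard-coded for this particular two-element system.

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in bits; zero-probability
    terms of p contribute nothing."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

STATES = ['00', '01', '10', '11']  # joint states (element 1, element 2)

def p_next_given_prev(nxt, prev):
    """Forward repertoire of the example mechanism: element 1 fires at
    random; element 2 copies element 1's previous value."""
    p_e1 = 0.5                                # element 1 is random
    p_e2 = 1.0 if nxt[1] == prev[0] else 0.0  # element 2 copies element 1
    return p_e1 * p_e2

# Potential repertoire: maximum-entropy (uniform) prior over past states.
potential = [0.25] * 4

# Actual repertoire via Bayes' rule, given the observed state 11 at time t.
observed = '11'
unnormalized = [p_next_given_prev(observed, s) * prior
                for s, prior in zip(STATES, potential)]
z = sum(unnormalized)
actual = [u / z for u in unnormalized]

# Effective information generated: ei = H[p(actual) || p(potential)].
ei = kl_divergence(actual, potential)
print(actual, ei)  # [0.0, 0.0, 0.5, 0.5] 1.0
```

Bayes' rule rules out the past states 00 and 01 (element 2 could not have become 1 from them), leaving the actual repertoire (0, 0, 1/2, 1/2) and exactly one bit of effective information, matching the calculation above.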

### Integration (Φ)

$\Phi (X(mech,x_1)) = H[p(X_0(mech,x_1)) \parallel \Pi p( ^k M_0(mech,\mu_1))]$ for $^k M_0 \in MIP$

where X is our system, mech is that system's mechanism, $x_1$ is a state of the system, $\Pi(p(^k M_0(mech,\mu_1)))$ is the product of all the probability distributions of each part of the system in the minimal information partition.

$\Phi$ will therefore be high when a lot of information is generated by interactions among the parts of a system, as opposed to within the parts themselves.

### Complexes

A complex is a set of elements that generate integrated information that is not fully contained in a larger set of higher $\Phi$.

This then leads naturally to the notion of a main complex, which is the complex in a system that generates the largest amount of $\Phi$. Note that a main complex can partially contain complexes of lower $\Phi$ within it.
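The definitions of complex and main complex reduce to a simple search once Φ is known for every candidate subset. The Φ values below are purely illustrative placeholders, not derived from any particular network.

```python
# Hypothetical Phi values for candidate subsets of a three-element
# system {a, b, c} (illustrative numbers only).
phi = {
    frozenset('ab'):  0.8,
    frozenset('bc'):  0.3,
    frozenset('abc'): 0.5,
}

def is_complex(subset):
    """A complex generates integrated information that is not fully
    contained in any larger set of higher Phi."""
    return not any(subset < other and phi[other] > phi[subset]
                   for other in phi)

complexes = [s for s in phi if is_complex(s)]
main_complex = max(complexes, key=phi.get)
print(sorted(main_complex), phi[main_complex])  # ['a', 'b'] 0.8
```

Here {b, c} is excluded because the larger set {a, b, c} has higher Φ, while {a, b} qualifies as the main complex even though it sits inside {a, b, c}, matching the note that a main complex can contain, or be contained alongside, sets of lower Φ.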

## Interpretations of different aspects of consciousness

### Quality of consciousness

Philosophers call the individual perceptions of the conscious mind qualia; for example, the 'red' quale is the distinctive experience of redness upon seeing the color red.

IIT defines a multi-dimensional space called qualia space, or Q-space. This space has an axis for every state of the system. Any point in this space therefore has a component for every state. If we restrict the components to be numbers from 0 to 1, then we can view the components as probabilities that the system is in that state. Thus a point in such a Q-space represents a probability distribution.

Using relative entropy, the amount of information generated by a single connection c within the system is quantified via the equation:

$\Phi_c = H[p(X(mech,x)) \parallel p(Y(mech,y))]$

where Y is the system with that connection removed. Thus there are points Y and X in Q-space that correspond to the probability distributions of the system respectively with and without the connection c. The vector drawn from Y to X has length $\Phi_c$, is associated with the connection c and is called a q-arrow. Such a q-arrow is a representation of the informational relationship specified by a connection.
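A q-arrow can be sketched numerically. The two distributions below are hypothetical points in the Q-space of a four-state system, standing in for the system with and without some connection c.

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in bits."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Hypothetical points in Q-space (one component per system state 00..11):
X = [0.0, 0.0, 0.5, 0.5]      # distribution with the connection c intact
Y = [0.25, 0.25, 0.25, 0.25]  # distribution with the connection c removed

# The q-arrow for c is the vector from Y to X; its length is Phi_c.
q_arrow = [x - y for x, y in zip(X, Y)]
phi_c = kl_divergence(X, Y)
print(q_arrow, phi_c)  # [-0.25, -0.25, 0.25, 0.25] 1.0
```

Removing the connection flattens the distribution toward uniform, so the connection specifies one bit of information and its q-arrow has length Φ_c = 1.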

## References

1. ^ Tononi, Giulio (December 2008). "Consciousness as integrated information: a provisional manifesto". The Biological Bulletin 215 (3): 216–242. doi:10.2307/25470707. ISSN 0006-3185. PMID 19098144.
2. ^ Koch, Christof. "A "Complex" Theory of Consciousness". Scientific American. Retrieved 2012-04-18.
3. ^ Barrett, Adam B.; Seth, Anil K. (2011). "Practical measures of integrated information for time-series data". PLoS Computational Biology 7 (1): e1001052.