An adaptive system is a set of interacting or interdependent entities, real or abstract, forming an integrated whole that together are able to respond to environmental changes or changes in the interacting parts, in a way analogous to either continuous physiological homeostasis or evolutionary adaptation in biology. Feedback loops represent a key feature of adaptive systems, such as ecosystems and individual organisms; or in the human world, communities, organizations, and families.

Artificial adaptive systems include robots with control systems that utilize negative feedback to maintain desired states.
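Negative feedback of this kind can be sketched with a minimal proportional controller (a generic illustration, not a specific robot control system): each step measures the deviation from the desired state and applies a correction that opposes it.

```python
# Minimal sketch of negative feedback: a proportional controller
# nudging a system state toward a desired setpoint.

def p_controller_step(state, setpoint, gain=0.5):
    """Return the next state after one feedback correction."""
    error = setpoint - state      # deviation from the desired state
    return state + gain * error   # correction opposes the deviation

state = 0.0
setpoint = 10.0
for _ in range(20):
    state = p_controller_step(state, setpoint)

# The error is damped every step, so the state converges to the setpoint.
print(abs(setpoint - state) < 0.01)  # → True
```

Because each correction is proportional to, and directed against, the remaining error, the deviation shrinks geometrically toward zero.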

## The law of adaptation

The law of adaptation can be stated informally as:

Every adaptive system converges to a state in which all kinds of stimulation cease.[1]

Formally, the law can be defined as follows:

Given a system ${\displaystyle S}$, we say that a physical event ${\displaystyle E}$ is a stimulus for the system ${\displaystyle S}$ if and only if the probability ${\displaystyle P(S\rightarrow S'|E)}$ that the system undergoes a change or is perturbed (in its elements or in its processes) when the event ${\displaystyle E}$ occurs is strictly greater than the prior probability that ${\displaystyle S}$ undergoes a change independently of ${\displaystyle E}$:

${\displaystyle P(S\rightarrow S'|E)>P(S\rightarrow S')}$
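This stimulus condition can be checked empirically with a Monte Carlo estimate. The toy system below is hypothetical (the two probabilities are chosen for illustration only): its state flips more often when the event ${\displaystyle E}$ occurs than it does at baseline, so ${\displaystyle E}$ qualifies as a stimulus.

```python
import random

random.seed(0)

# Hypothetical toy system: the state changes with baseline
# probability 0.1, but with probability 0.6 when event E occurs.
P_BASE, P_GIVEN_E = 0.1, 0.6

def changed(event_occurred):
    """Sample whether the system changed state on one trial."""
    p = P_GIVEN_E if event_occurred else P_BASE
    return random.random() < p

N = 100_000
p_change_given_e = sum(changed(True) for _ in range(N)) / N
p_change = sum(changed(False) for _ in range(N)) / N

# E is a stimulus: the conditional change probability strictly
# exceeds the unconditional one, P(S -> S' | E) > P(S -> S').
print(p_change_given_e > p_change)  # → True
```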

Let ${\displaystyle S}$ be an arbitrary system subject to changes in time ${\displaystyle t}$ and let ${\displaystyle E}$ be an arbitrary event that is a stimulus for the system ${\displaystyle S}$: we say that ${\displaystyle S}$ is an adaptive system if and only if, as ${\displaystyle t}$ tends to infinity ${\displaystyle (t\rightarrow \infty )}$, the probability that the system ${\displaystyle S}$ changes its behavior ${\displaystyle (S\rightarrow S')}$ in a time step ${\displaystyle t_{0}}$ given the event ${\displaystyle E}$ is equal to the probability that the system changes its behavior independently of the occurrence of the event ${\displaystyle E}$. In mathematical terms:

1. ${\displaystyle P_{t_{0}}(S\rightarrow S'|E)>P_{t_{0}}(S\rightarrow S')>0}$
2. ${\displaystyle \lim _{t\rightarrow \infty }P_{t}(S\rightarrow S'|E)=P_{t}(S\rightarrow S')}$

Thus, for each instant ${\displaystyle t}$ there will exist a temporal interval ${\displaystyle h}$ such that:

${\displaystyle P_{t+h}(S\rightarrow S'|E)-P_{t+h}(S\rightarrow S')<P_{t}(S\rightarrow S'|E)-P_{t}(S\rightarrow S')}$
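The law can be illustrated with a toy habituation model (the functional form below, an exponential decay of the stimulated response toward the baseline, is an assumption chosen for illustration, not part of the formal definition): the gap between the conditional and unconditional change probabilities shrinks over time and vanishes in the limit.

```python
import math

BASE = 0.05        # P_t(S -> S'): baseline change probability
INIT_EXTRA = 0.50  # initial excess response to the stimulus E
TAU = 10.0         # assumed habituation time constant

def p_given_e(t):
    """P_t(S -> S' | E): stimulated change probability, decaying with t."""
    return BASE + INIT_EXTRA * math.exp(-t / TAU)

def gap(t):
    """Excess responsiveness to E at time t."""
    return p_given_e(t) - BASE

# Condition 1 holds at t = 0: P(S -> S' | E) > P(S -> S') > 0.
print(p_given_e(0) > BASE > 0)  # → True

# Condition 2: the gap decreases and tends to zero, so for each t
# there is an interval h after which the gap is strictly smaller.
print(gap(100) < gap(0))        # → True
print(gap(100) < 1e-3)          # → True
```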

In an adaptive system, a parameter changes slowly and has no preferred value. In a self-adjusting system, though, the parameter value "depends on the history of the system dynamics". One of the most important qualities of self-adjusting systems is their "adaptation to the edge of chaos", or ability to avoid chaos. Practically speaking, by heading to the edge of chaos without going further, a leader may act spontaneously yet without disaster. A March/April 2009 Complexity article further explains the self-adjusting systems used and the realistic implications.[2] Physicists have shown that adaptation to the edge of chaos occurs in almost all systems with feedback.[3]
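Adaptation to the edge of chaos can be sketched with a self-adjusting logistic map. The feedback rule below is a toy construction for illustration, not the scheme of the cited work: the map's parameter ${\displaystyle r}$ is slowly nudged down when the local dynamics look chaotic (positive local Lyapunov estimate) and up when they look regular, so ${\displaystyle r}$ drifts from the ordered regime toward the order/chaos boundary (near ${\displaystyle r_{c}\approx 3.57}$).

```python
import math

# Toy self-adjusting logistic map (illustrative feedback rule, not the
# scheme of the cited reference): the fast variable x follows the map,
# while the slow parameter r is adjusted against a batch-averaged
# estimate of the local Lyapunov exponent.

x, r = 0.5, 3.2   # start well inside the ordered (periodic) regime
eps = 0.01        # slow adjustment rate for the parameter

for epoch in range(2000):
    lyap_sum = 0.0
    for _ in range(100):
        x = r * x * (1 - x)
        # Local Lyapunov contribution from the map derivative r(1 - 2x);
        # the small floor avoids log(0) at superstable points.
        lyap_sum += math.log(abs(r * (1 - 2 * x)) + 1e-12)
    r -= eps * (lyap_sum / 100)   # negative feedback on chaoticity
    r = min(max(r, 3.0), 3.99)    # keep r in the logistic map's range

# r has drifted upward from 3.2 and hovers near the edge of chaos,
# where the average Lyapunov exponent crosses zero.
print(3.4 < r < 3.8)
```

The design point is the separation of timescales: the state changes every step, while the parameter integrates feedback over many steps, which is characteristic of self-adjusting systems.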

## Practopoiesis

Practopoiesis, a term coined by Danko Nikolić, refers to a kind of adaptive or self-adjusting system in which autopoiesis of an organism or a cell occurs through allopoietic interactions among its components.[4] The components are organized into a poietic hierarchy: one component creates another. For example, according to this proposal, in the brain this hierarchy leads to the capability of learning to learn. The theory proposes that living systems exhibit a hierarchy of four such poietic operations in total:

   evolution (i) → gene expression (ii) → non gene-involving homeostatic mechanisms (anapoiesis) (iii) → cell function (iv)


Practopoiesis challenges current neuroscience doctrine by asserting that mental operations primarily occur at the anapoietic level (iii), i.e., that minds emerge from fast homeostatic mechanisms. This contrasts with the widespread belief that thinking is synonymous with neural activity (level iv).