Adaptive system

From Wikipedia, the free encyclopedia

An adaptive system is a set of interacting or interdependent entities, real or abstract, that forms an integrated whole able to respond to environmental changes or to changes in its interacting parts, in a way analogous to either continuous physiological homeostasis or evolutionary adaptation in biology. Feedback loops are a key feature of adaptive systems, such as ecosystems and individual organisms, or, in the human world, communities, organizations, and families. Adaptive systems can be organized into a hierarchy.

Artificial adaptive systems include robots with control systems that utilize negative feedback to maintain desired states.
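
As an illustration, the following minimal Python sketch implements such a negative-feedback loop for a hypothetical thermostat; the scenario, names, and constants are illustrative assumptions rather than any particular robot's control code.

    # A minimal sketch of negative feedback: a proportional controller that
    # repeatedly nudges a system's state toward a desired setpoint. The
    # thermostat scenario and all constants are illustrative assumptions.
    def simulate_thermostat(setpoint=22.0, temperature=15.0, gain=0.3, steps=50):
        """Drive a temperature toward `setpoint` using negative feedback."""
        history = []
        for _ in range(steps):
            error = setpoint - temperature  # deviation from the desired state
            temperature += gain * error     # corrective action opposes the error
            history.append(temperature)
        return history

    print(f"final temperature: {simulate_thermostat()[-1]:.2f}")  # approaches 22.0

Because the corrective action always opposes the error, any perturbation away from the desired state is progressively cancelled; this is the defining property of negative feedback.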

The law of adaptation

The law of adaptation may be stated informally as:

Every adaptive system converges to a state in which all kinds of stimulation cease.[1]

Formally, the law can be defined as follows:

Given a system $S$, we say that a physical event $E$ is a stimulus for the system $S$ if and only if the probability $P(S \rightarrow S' \mid E)$ that the system suffers a change or perturbation (in its elements or in its processes) when the event $E$ occurs is strictly greater than the prior probability that $S$ suffers a change independently of $E$:

  $P(S \rightarrow S' \mid E) > P(S \rightarrow S')$

Let $S$ be an arbitrary system subject to changes in time $t$ and let $E$ be an arbitrary event that is a stimulus for the system $S$: we say that $S$ is an adaptive system if and only if, as $t$ tends to infinity ($t \to \infty$), the probability that the system $S$ changes its behavior $(S \rightarrow S')$ in a time step $t_0$ given the event $E$ is equal to the probability that the system changes its behavior independently of the occurrence of the event $E$. In mathematical terms:

  1. $P_{t_0}(S \rightarrow S' \mid E) > P_{t_0}(S \rightarrow S') > 0$
  2. $\lim_{t \to \infty} P_t(S \rightarrow S' \mid E) = P_t(S \rightarrow S')$

Thus, for each instant $t$ there will exist a temporal interval $h$ such that:

  $P_{t+h}(S \rightarrow S' \mid E) - P_{t+h}(S \rightarrow S') < P_t(S \rightarrow S' \mid E) - P_t(S \rightarrow S')$
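
As an illustration, the toy Python sketch below assumes a simple exponential habituation model (the decay rule is an assumption, not part of the law) and shows the stimulus-conditioned change probability converging to the baseline, as the law requires.

    # Toy sketch of the law of adaptation: the probability that the system
    # changes state given the stimulus E decays toward the stimulus-independent
    # baseline probability. The exponential decay rule is an assumed model
    # chosen for illustration; the law only requires the limit condition.
    BASELINE_P = 0.05       # P_t(S -> S'): change probability without the stimulus
    INITIAL_EXTRA_P = 0.60  # initial excess response, so P(S -> S' | E) > P(S -> S')
    DECAY = 0.9             # per-step habituation factor (assumption)

    def p_change_given_stimulus(t):
        """P_t(S -> S' | E): approaches BASELINE_P as t grows."""
        return BASELINE_P + INITIAL_EXTRA_P * (DECAY ** t)

    for t in (0, 10, 50, 100):
        print(f"t={t:3d}  P(change|E)={p_change_given_stimulus(t):.4f}  P(change)={BASELINE_P:.4f}")
    # As t grows, the two probabilities coincide: the stimulus stops making a
    # difference, which is the convergence the law describes.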

Benefit of self-adjusting systems

In an adaptive system, a parameter changes slowly and has no preferred value. In a self-adjusting system, though, the parameter value "depends on the history of the system dynamics". One of the most important qualities of self-adjusting systems is their "adaptation to the edge of chaos", or ability to avoid chaos. Practically speaking, by heading to the edge of chaos without going further, a leader may act spontaneously yet without disaster. A March/April 2009 Complexity article further explains the self-adjusting systems used and the practical implications.[2] Physicists have shown that adaptation to the edge of chaos occurs in almost all systems with feedback.[3]
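
A toy sketch of this behaviour is given below for the logistic map, whose control parameter is made to depend on the history of the dynamics. Steering the parameter with a running Lyapunov-exponent estimate is a didactic shortcut assumed here for clarity; the cited studies obtain the migration from generic low-pass-filtered feedback without measuring chaos explicitly.

    # Toy sketch of a self-adjusting system: a logistic map whose parameter `a`
    # drifts as a function of the map's own recent history. Using a running
    # Lyapunov-exponent estimate as the feedback signal is an assumption made
    # for clarity, not the mechanism of the cited studies.
    import math

    def self_adjusting_logistic(a=3.9, x=0.5, eps=1e-4, steps=200_000):
        """Iterate x <- a*x*(1-x) while slowly adjusting `a` from its history."""
        lyap = 0.0
        for n in range(1, steps + 1):
            x = a * x * (1.0 - x)
            # running estimate of the Lyapunov exponent: average of log|a*(1-2x)|
            lyap += (math.log(abs(a * (1.0 - 2.0 * x)) + 1e-12) - lyap) / min(n, 1000)
            a -= eps * lyap                # chaotic dynamics (lyap > 0) lower `a`,
            a = min(max(a, 2.5), 4.0)      # regular dynamics (lyap < 0) raise it
        return a

    print(f"parameter settles near a chaos boundary: {self_adjusting_logistic():.3f}")
    # The parameter hovers where the Lyapunov estimate crosses zero, i.e. at a
    # boundary between chaotic and regular behaviour ("the edge of chaos").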

Hierarchy of adaptations: Practopoiesis

The feedback loops and poietic interaction in hierarchical adaptations.

How do the various types of adaptation interact in a living system? Practopoiesis,[4] a term due to its originator Danko Nikolić,[5] refers to a hierarchy of adaptation mechanisms answering this question. The adaptive hierarchy forms a kind of self-adjusting system in which autopoiesis of the entire organism or cell occurs through a hierarchy of allopoietic interactions among its components.[6] This is possible because the components are organized into a poietic hierarchy: the adaptive actions of one component result in the creation of another component. The theory proposes that living systems exhibit a hierarchy of a total of four such adaptive poietic operations:

   evolution (i) → gene expression (ii) → non-gene-involving homeostatic mechanisms (anapoiesis) (iii) → final cell function (iv)
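
The nesting below is a loose Python illustration of such a poietic hierarchy, in which each slower level creates or re-parameterizes the machinery of the faster level beneath it; the mapping of closures to levels i–iv is an informal assumption, not a formalization taken from the theory.

    # Loose illustration of a poietic hierarchy: each slower level does not act
    # on the environment directly but creates (parameterizes) the faster level
    # below it. Mapping Python closures to levels i-iv is an informal assumption.
    def evolution():                          # level i: slowest; shapes a toy "genome"
        return {"gain": 0.5, "target": 1.0}

    def gene_expression(genome):              # level ii: builds homeostatic machinery
        def anapoiesis(sensor_reading):       # level iii: fast, non-genetic adjustment
            correction = genome["gain"] * (genome["target"] - sensor_reading)
            def cell_function(input_signal):  # level iv: fastest; the final action
                return input_signal + correction
            return cell_function
        return anapoiesis

    adjust = gene_expression(evolution())     # levels i and ii set up level iii
    respond = adjust(sensor_reading=0.8)      # level iii re-tunes level iv on the fly
    print(f"{respond(0.2):.2f}")              # level iv acts on the immediate input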

As the hierarchy moves towards higher levels of organization, the speed of adaptation increases: evolution is the slowest, gene expression is faster, and the final cell function is the fastest. Ultimately, practopoiesis challenges current neuroscience doctrine by asserting that mental operations primarily occur at the homeostatic, anapoietic level (iii), i.e., that minds and thought emerge from fast homeostatic mechanisms poietically controlling the cell function. This contrasts with the widespread assumption that thinking is synonymous with computations executed at the level of neural activity (i.e., with the 'final cell function' at level iv).

Sharov proposed that only eukaryotic cells can achieve all four levels of organization.[7]

Each slower level contains knowledge that is more general than the faster level; for example, genes contain more general knowledge than anapoietic mechanisms, which in turn contain more general knowledge than cell functions. This hierarchy of knowledge enables the anapoietic level to implement concepts, which are the most fundamental ingredients of a mind. Activation of concepts through anapoiesis is suggested to underlie ideasthesia. Practopoiesis also has implications for understanding the limitations of Deep Learning.[8]

Empirical tests of practopoiesis require learning on double-loop tasks: one needs to assess how the learning capability itself adapts over time, i.e., how the system learns to learn (adapts its adapting skills).[9][10]
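
The sketch below illustrates, under assumed update rules, what such a double-loop assessment might look like: an inner loop learns each task while an outer loop adapts the learner's own learning rate, and the test asks whether later tasks are learned better.

    # Hedged toy sketch of a double-loop test: an inner loop learns a task while
    # an outer loop adapts the learner's own learning rate across tasks. The
    # update rules are illustrative assumptions, not the cited protocols.
    import random

    def run_task(target, learning_rate, trials=30):
        """Inner loop: error-driven estimation of `target`; returns the final error."""
        estimate = 0.0
        for _ in range(trials):
            estimate += learning_rate * (target - estimate)
        return abs(target - estimate)

    def double_loop_experiment(tasks=10, meta_rate=0.3):
        learning_rate, errors = 0.05, []
        for _ in range(tasks):
            err = run_task(random.uniform(-1.0, 1.0), learning_rate)
            errors.append(err)
            # outer loop: poor performance nudges the learning capability itself
            learning_rate = min(0.9, learning_rate + meta_rate * err)
        return errors

    random.seed(0)
    print([round(e, 3) for e in double_loop_experiment()])  # errors shrink across
    # tasks as the system adapts its own adapting skill ("learns to learn")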

It has been proposed that anapoiesis is implemented in the brain by metabotropic receptors and G protein-gated ion channels.[11] These membrane proteins are suggested to transiently select subnetworks and, by doing so, give rise to cognition.

Notes

  1. ^ José Antonio Martín H., Javier de Lope and Darío Maravall: "Adaptation, Anticipation and Rationality in Natural and Artificial Systems: Computational Paradigms Mimicking Nature". Natural Computing, December 2009, Vol. 8(4), pp. 757–775.
  2. ^ Hübler, A. & Wotherspoon, T.: "Self-Adjusting Systems Avoid Chaos". Complexity, 14(4), 8–11, 2008.
  3. ^ Wotherspoon, T.; Hubler, A. (2009). "Adaptation to the edge of chaos with random-wavelet feedback". J Phys Chem A. 113 (1): 19–22. Bibcode:2009JPCA..113...19W. doi:10.1021/jp804420g. PMID 19072712.
  4. ^ "Practopoiesis".
  5. ^ "Danko Nikolić (Max Planck Institute for Brain Research, Frankfurt am Main) on ResearchGate - Expertise: Artificial Intelligence, Quantitative Psychology, Cognitive Psychology". Archived from the original on 2015-07-23.
  6. ^ Danko Nikolić (2015). "Practopoiesis: Or how life fosters a mind". Journal of Theoretical Biology. 373: 40–61. arXiv:1402.5332. Bibcode:2015JThBi.373...40N. doi:10.1016/j.jtbi.2015.03.003. PMID 25791287. S2CID 12680941.
  7. ^ Sharov, A. A. (2018). "Mind, agency, and biosemiotics." Journal of Cognitive Science, 19(2), 195-228.
  8. ^ Nikolić, D. (2017). "Why deep neural nets cannot ever match biological intelligence and what to do about it?" International Journal of Automation and Computing, 14(5), 532-541.
  9. ^ El Hady, A. (2016). Closed loop neuroscience. Academic Press.
  10. ^ Dong, X., Du, X., & Bao, M. (2020). "Repeated contrast adaptation does not cause habituation of the adapter." Frontiers in Human Neuroscience, 14, 569. (https://www.frontiersin.org/articles/10.3389/fnhum.2020.589634/full)
  11. ^ Nikolić, D. (2023). Where is the mind within the brain? Transient selection of subnetworks by metabotropic receptors and G protein-gated ion channels. Computational Biology and Chemistry, 107820.
