Reservoir computing
Reservoir computing is a framework for computation derived from recurrent neural network theory that maps input signals into higher dimensional computational spaces through the dynamics of a fixed, non-linear system called a reservoir.[1] After the input signal is fed into the reservoir, which is treated as a "black box," a simple readout mechanism is trained to read the state of the reservoir and map it to the desired output.[1] The first key benefit of this framework is that training is performed only at the readout stage, as the reservoir dynamics are fixed.[1] The second is that the computational power of naturally available systems, both classical and quantum mechanical, can be utilized to reduce the effective computational cost.[2]
History
The concept of reservoir computing stems from the use of recursive connections within neural networks to create a complex dynamical system.[3] The resultant complexity of such recurrent neural networks was found to be useful in solving a variety of problems including language processing and dynamic system modeling.[3] However, training of recurrent neural networks is challenging and computationally expensive.[3] Reservoir computing reduces those training-related challenges by fixing the dynamics of the reservoir and only training the linear output layer.[3]
Recent advances in both AI and quantum information theory have given rise to the concept of quantum neural networks.[4] These hold promise in quantum information processing, which is challenging to classical networks, but can also find application in solving classical problems.[4][5] In 2018, a physical realization of a quantum reservoir computing architecture was demonstrated in the form of nuclear spins within a molecular solid.[5] In 2019, another possible implementation of quantum reservoir processors was proposed in the form of two-dimensional fermionic lattices.[5]
Classical reservoir computing
Reservoir
The 'reservoir' in reservoir computing is the internal structure of the computer, and must have two properties: it must be made up of individual, non-linear units, and it must be capable of storing information.[6] The non-linearity describes the response of each unit to input, which is what allows reservoir computers to solve complex problems.[6] Reservoirs are able to store information by connecting the units in recurrent loops, where the previous input affects the next response.[6] The change in reaction due to the past allows the computers to be trained to complete specific tasks.[6]
Reservoirs can be virtual or physical.[6] Virtual reservoirs are typically randomly generated and are designed like neural networks.[6][3] Virtual reservoirs can be designed to have non-linearity and recurrent loops, but, unlike neural networks, the connections between units are randomized and remain unchanged throughout computation.[6] Physical reservoirs are possible because of the inherent non-linearity of certain natural systems.[1] The interaction between ripples on the surface of water contains the nonlinear dynamics required for reservoir creation, and a pattern-recognition reservoir computer was demonstrated by first generating ripples with electric motors and then recording and analyzing the ripples at the readout.[1]
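A virtual reservoir of this kind can be sketched in a few lines: fixed random input and recurrent weight matrices drive non-linear units whose state depends on past inputs, and none of the weights are ever trained. The sizes, scaling factor, and input signal below are illustrative assumptions, not values prescribed by the framework.

```python
import numpy as np

rng = np.random.default_rng(0)

n_units = 100                                    # reservoir size (illustrative)
W_in = rng.uniform(-0.5, 0.5, (n_units, 1))      # fixed random input weights
W = rng.uniform(-0.5, 0.5, (n_units, n_units))   # fixed random recurrent weights

# Scale the recurrent matrix so its spectral radius is below 1,
# a common heuristic for keeping the reservoir dynamics stable.
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

def run_reservoir(inputs):
    """Drive the reservoir with a 1-D input sequence and collect its states."""
    x = np.zeros(n_units)
    states = []
    for u in inputs:
        # Non-linear unit response; the previous state feeds back through W,
        # so past inputs affect the current response.
        x = np.tanh(W_in[:, 0] * u + W @ x)
        states.append(x.copy())
    return np.array(states)

states = run_reservoir(np.sin(np.linspace(0, 8 * np.pi, 200)))
print(states.shape)  # (200, 100)
```

Only the readout trained on these collected states changes from task to task; the reservoir itself stays fixed.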
Readout
The readout is a neural network layer that performs a linear transformation on the output of the reservoir.[1] The weights of the readout layer are trained by analyzing the spatiotemporal patterns of the reservoir after excitation by known inputs, and by utilizing a training method such as a linear regression or a Ridge regression.[1] As its implementation depends on spatiotemporal reservoir patterns, the details of readout methods are tailored to each type of reservoir.[1] For example, the readout for a reservoir computer using a container of liquid as its reservoir might entail observing spatiotemporal patterns on the surface of the liquid.[1]
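For a reservoir whose states are collected as vectors, ridge-regression training of the readout reduces to a single linear solve. The sketch below assumes a matrix of already-collected reservoir states; the synthetic target is there purely to check that the fit recovers a known linear map, and all names and sizes are illustrative.

```python
import numpy as np

def train_readout(states, targets, ridge=1e-6):
    """Fit linear readout weights by ridge regression:
    W_out = Y^T X (X^T X + lambda I)^(-1)."""
    X, Y = states, targets
    A = X.T @ X + ridge * np.eye(X.shape[1])
    return np.linalg.solve(A, X.T @ Y).T

# Toy check: if the targets are a fixed linear map of the states,
# the trained readout recovers that map almost exactly.
rng = np.random.default_rng(1)
states = rng.standard_normal((500, 50))   # 500 collected states, 50 units
true_W = rng.standard_normal((2, 50))     # hypothetical target map
targets = states @ true_W.T
W_out = train_readout(states, targets)
print(np.allclose(W_out, true_W, atol=1e-3))  # True
```

The small ridge term keeps the solve well-conditioned when reservoir states are correlated, which is typical in practice.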
Types
Context reverberation network
An early example of reservoir computing was the context reverberation network.[7] In this architecture, an input layer feeds into a high dimensional dynamical system which is read out by a trainable single-layer perceptron. Two kinds of dynamical system were described: a recurrent neural network with fixed random weights, and a continuous reaction-diffusion system inspired by Alan Turing’s model of morphogenesis. At the trainable layer, the perceptron associates current inputs with the signals that reverberate in the dynamical system; the latter were said to provide a dynamic "context" for the inputs. In the language of later work, the reaction-diffusion system served as the reservoir.
Echo state network
The Tree Echo State Network (TreeESN) model represents a generalization of the reservoir computing framework to tree structured data.[8]
Liquid-state machine
Nonlinear transient computation
This type of information processing is most relevant when time-dependent input signals depart from the mechanism's internal dynamics.[9] These departures cause transients, or temporary alterations, which are reflected in the device's output.[9]
Deep reservoir computing
The extension of the reservoir computing framework towards deep learning, with the introduction of deep reservoir computing and of the Deep Echo State Network (DeepESN) model,[10][11][12][13] makes it possible to develop efficiently trained models for hierarchical processing of temporal data, while also enabling investigation of the inherent role of layered composition in recurrent neural networks.
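The layered idea can be sketched as a stack of fixed reservoirs in which the state sequence of each layer serves as the input sequence of the next, with the readout seeing the concatenated states of all layers. The layer sizes and scaling heuristic below are illustrative assumptions, not the specific DeepESN design of the cited papers.

```python
import numpy as np

rng = np.random.default_rng(2)

def make_layer(n_in, n_units, rho=0.9):
    """One fixed, randomly wired reservoir layer (weights are never trained)."""
    W_in = rng.uniform(-0.5, 0.5, (n_units, n_in))
    W = rng.uniform(-0.5, 0.5, (n_units, n_units))
    W *= rho / np.max(np.abs(np.linalg.eigvals(W)))  # stability heuristic
    return W_in, W

def run_deep(layers, inputs):
    """Run a stack of reservoirs: each layer's state sequence
    becomes the next layer's input sequence."""
    seq = inputs
    all_states = []
    for W_in, W in layers:
        x = np.zeros(W.shape[0])
        states = []
        for u in seq:
            x = np.tanh(W_in @ np.atleast_1d(u) + W @ x)
            states.append(x.copy())
        seq = states
        all_states.append(np.array(states))
    # A readout would typically see the concatenated states of all layers.
    return np.concatenate(all_states, axis=1)

layers = [make_layer(1, 30), make_layer(30, 30)]
feats = run_deep(layers, np.sin(np.linspace(0, 4 * np.pi, 100)))
print(feats.shape)  # (100, 60)
```

Because deeper layers see the filtered output of earlier ones, the stack develops progressively slower, more abstract temporal features, which is the hierarchical processing referred to above.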
Quantum reservoir computing
Quantum reservoir computing utilizes the nonlinear nature of quantum mechanical interactions or processes to form the characteristic nonlinear reservoirs.[4][5]
Types
2-D Fermionic lattices
In this architecture, randomized coupling between lattice sites grants the reservoir the "black box" property inherent to reservoir processors.[4] The reservoir is then excited by an incident optical field, which acts as the input. Readout occurs in the form of occupational numbers of lattice sites, which are naturally nonlinear functions of the input.[4]
Nuclear spins in a molecular solid
In this architecture, quantum mechanical coupling between spins of neighboring atoms within the molecular solid provides the non-linearity required to create the higher-dimensional computational space.[5] The reservoir is then excited by radiofrequency electromagnetic radiation tuned to the resonance frequencies of relevant nuclear spins.[5] Readout occurs by measuring the nuclear spin states.[5]
Research initiatives
IEEE Task Force on Reservoir Computing
In 2018, the IEEE Task Force on Reservoir Computing was established to promote and stimulate the development of reservoir computing research from both theoretical and applied perspectives.
Physical reservoir computers
Fluidic reservoir computer[14]
Reservoir computer using coupled oscillators[15]
Reservoir computer using memristor[16]
Biological reservoir computer[1]
See also
References
- ^ a b c d e f g h i j Tanaka, Gouhei; Yamane, Toshiyuki; Héroux, Jean Benoit; Nakane, Ryosho; Kanazawa, Naoki; Takeda, Seiji; Numata, Hidetoshi; Nakano, Daiju; Hirose, Akira (2019). "Recent advances in physical reservoir computing: A review". Neural Networks. 115: 100–123. doi:10.1016/j.neunet.2019.03.005. ISSN 0893-6080. PMID 30981085.
- ^ Röhm, André; Lüdge, Kathy (2018-08-03). "Multiplexed networks: reservoir computing with virtual and real nodes". Journal of Physics: Communications. 2 (8): 085007. Bibcode:2018JPhCo...2h5007R. doi:10.1088/2399-6528/aad56d. ISSN 2399-6528.
- ^ a b c d e Schrauwen, Benjamin; Verstraeten, David; Van Campenhout, Jan (2007). "An overview of reservoir computing: theory, applications, and implementations". Proceedings of the European Symposium on Artificial Neural Networks ESANN 2007. pp. 471–482.
- ^ a b c d e Ghosh, Sanjib; Opala, Andrzej; Matuszewski, Michał; Paterek, Tomasz; Liew, Timothy C. H. (December 2019). "Quantum reservoir processing". NPJ Quantum Information. 5 (1): 35. arXiv:1811.10335. Bibcode:2019npjQI...5...35G. doi:10.1038/s41534-019-0149-8. ISSN 2056-6387.
- ^ a b c d e f g Negoro, Makoto; Mitarai, Kosuke; Fujii, Keisuke; Nakajima, Kohei; Kitagawa, Masahiro (2018-06-28). "Machine learning with controllable quantum dynamics of a nuclear spin ensemble in a solid". arXiv:1806.10910 [quant-ph].
- ^ a b c d e f g Soriano, Miguel C. (2017-02-06). "Viewpoint: Reservoir Computing Speeds Up". Physics. 10. doi:10.1103/Physics.10.12.
- ^ Kirby, Kevin (1991). "Context dynamics in neural sequential learning". Proceedings of the Florida Artificial Intelligence Research Symposium FLAIRS. pp. 66–70.
- ^ Gallicchio, Claudio; Micheli, Alessio (2013). "Tree Echo State Networks". Neurocomputing. 101: 319–337. doi:10.1016/j.neucom.2012.08.017. hdl:11568/158480.
- ^ a b Crook, Nigel (2007). "Nonlinear Transient Computation". Neurocomputing. 70 (7–9): 1167–1176. doi:10.1016/j.neucom.2006.10.148.
- ^ Pedrelli, Luca (2019). Deep Reservoir Computing: A Novel Class of Deep Recurrent Neural Networks (PhD thesis). Università di Pisa.
- ^ Gallicchio, Claudio; Micheli, Alessio; Pedrelli, Luca (2017-12-13). "Deep reservoir computing: A critical experimental analysis". Neurocomputing. 268: 87–99. doi:10.1016/j.neucom.2016.12.089. hdl:11568/851934.
- ^ Gallicchio, Claudio; Micheli, Alessio (2017-05-05). "Echo State Property of Deep Reservoir Computing Networks". Cognitive Computation. 9 (3): 337–350. doi:10.1007/s12559-017-9461-9. hdl:11568/851932. ISSN 1866-9956.
- ^ Gallicchio, Claudio; Micheli, Alessio; Pedrelli, Luca (December 2018). "Design of deep echo state networks". Neural Networks. 108: 33–47. doi:10.1016/j.neunet.2018.08.002. ISSN 0893-6080. PMID 30138751.
- ^ Fernando, Chrisantha; Sojakka, Sampsa (2003), "Pattern Recognition in a Bucket", Advances in Artificial Life, Springer Berlin Heidelberg, pp. 588–597, doi:10.1007/978-3-540-39432-7_63, ISBN 9783540200574
- ^ Coulombe, Jean C.; York, Mark C. A.; Sylvestre, Julien (2017-06-02). "Computing with networks of nonlinear mechanical oscillators". PLOS One. 12 (6): e0178663. arXiv:1704.06320. Bibcode:2017PLoSO..1278663C. doi:10.1371/journal.pone.0178663. ISSN 1932-6203. PMC 5456098. PMID 28575018.
- ^ Du, Chao; Cai, Fuxi; Zidan, Mohammed A.; Ma, Wen; Lee, Seung Hwan; Lu, Wei D. (2017). "Reservoir computing using dynamic memristors for temporal information processing". Nature Communications. 8 (1): 2204. Bibcode:2017NatCo...8.2204D. doi:10.1038/s41467-017-02337-y. ISSN 2041-1723. PMC 5736649. PMID 29259188.
Further reading
- Reservoir Computing using delay systems, Nature Communications 2011
- Optoelectronic Reservoir Computing, Scientific Reports February 2012
- Optoelectronic Reservoir Computing, Optics Express 2012
- All-optical Reservoir Computing, Nature Communications 2013
- Memristor Models for Machine learning, Neural Computation 2014 arxiv