Echo state network

From Wikipedia, the free encyclopedia

The echo state network (ESN) is a recurrent neural network with a sparsely connected hidden layer (typically about 1% connectivity). The connectivity and weights of the hidden neurons are fixed and randomly assigned. Only the weights of the output neurons are learned, so that the network can (re)produce specific temporal patterns.
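The following is a minimal sketch of this fixed, randomly generated part of an ESN, assuming NumPy, a tanh state update, and illustrative names (W_in, W_res, reservoir_size); the spectral-radius rescaling is a common heuristic rather than part of the definition above.

```python
import numpy as np

rng = np.random.default_rng(0)

n_inputs = 1          # dimension of the input signal u(t)
reservoir_size = 500  # number of hidden (reservoir) neurons
sparsity = 0.01       # roughly 1% of recurrent connections are non-zero

# Fixed random input weights and sparse recurrent weights: generated once,
# never modified during training.
W_in = rng.uniform(-0.5, 0.5, size=(reservoir_size, n_inputs))
W_res = rng.uniform(-0.5, 0.5, size=(reservoir_size, reservoir_size))
W_res[rng.random((reservoir_size, reservoir_size)) > sparsity] = 0.0

# Rescale so the spectral radius is below 1 (a common heuristic aimed at the
# "echo state" property).
W_res *= 0.9 / max(abs(np.linalg.eigvals(W_res)))

def update_state(x, u):
    """One step of the reservoir dynamics: x(t+1) = tanh(W_res x(t) + W_in u(t))."""
    return np.tanh(W_res @ x + W_in @ u)
```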

The main appeal of this network is that, although its behaviour is non-linear, the only weights modified during training are those of the synapses connecting the hidden neurons to the output neurons. As a result, the error function is quadratic in this weight vector, and minimizing it reduces to solving a linear system.
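The sketch below illustrates this reduction on a toy next-sample prediction task; it again assumes NumPy, and the reservoir setup, washout length, and ridge term are illustrative choices, not a reference implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
n_res, washout, T, ridge = 300, 100, 2000, 1e-6

# Fixed random reservoir (dense here for brevity) and input weights.
W_res = rng.uniform(-0.5, 0.5, size=(n_res, n_res))
W_res *= 0.9 / max(abs(np.linalg.eigvals(W_res)))
W_in = rng.uniform(-0.5, 0.5, size=(n_res, 1))

# Toy task: predict the next sample of a sine wave.
u = np.sin(0.2 * np.arange(T + 1))
inputs, targets = u[:-1], u[1:]

# Drive the reservoir and collect its states (discarding an initial washout).
x = np.zeros(n_res)
states = []
for t in range(T):
    x = np.tanh(W_res @ x + W_in[:, 0] * inputs[t])
    states.append(x)
X = np.array(states[washout:]).T   # shape (n_res, T - washout)
Y = targets[washout:][None, :]     # shape (1, T - washout)

# The squared error ||W_out X - Y||^2 is quadratic in W_out, so the optimum
# is found by solving a linear system (ridge-regularized normal equations).
W_out = np.linalg.solve(X @ X.T + ridge * np.eye(n_res), X @ Y.T).T

print("training RMSE:", np.sqrt(np.mean((W_out @ X - Y) ** 2)))
```

Only W_out is computed here; the reservoir weights stay exactly as they were randomly drawn, which is what keeps training a linear problem.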
