ADALINE

Figure: Learning inside a single-layer ADALINE

ADALINE (Adaptive Linear Neuron or later Adaptive Linear Element) is an early single-layer artificial neural network and the name of the physical device that implemented this network.[1][2][3][4][5] The network uses memistors. It was developed by Professor Bernard Widrow and his graduate student Ted Hoff at Stanford University in 1960. It is based on the McCulloch–Pitts neuron. It consists of a weight vector, a bias and a summation function.

The difference between Adaline and the standard (McCulloch–Pitts) perceptron is that in the learning phase the weights are adjusted according to the weighted sum of the inputs (the net). In the standard perceptron, the net is passed to the activation (transfer) function and the function's output is used for adjusting the weights.
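
To make this difference concrete, here is a minimal Python sketch (not part of the original article); the function names, the learning rate and the ±1 threshold activation are illustrative assumptions:

    import numpy as np

    def adaline_update(w, x, target, lr=0.1):
        # ADALINE: the error is measured on the raw weighted sum (the net).
        net = np.dot(w, x)
        return w + lr * (target - net) * x

    def perceptron_update(w, x, target, lr=0.1):
        # Standard perceptron: the net is first passed through the
        # activation (transfer) function, and that output drives the update.
        net = np.dot(w, x)
        output = 1.0 if net >= 0.0 else -1.0
        return w + lr * (target - output) * x

For a training example with a ±1 target, adaline_update keeps moving the net toward the target value itself, while perceptron_update changes the weights only when the thresholded output disagrees with the target.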

There also exists an extension known as Madaline.

Definition

Adaline is a single-layer neural network with multiple nodes, where each node accepts multiple inputs and generates one output. Given the following variables:

  • $x$ is the input vector
  • $w$ is the weight vector
  • $n$ is the number of inputs
  • $\theta$ is some constant
  • $y$ is the output of the model

then we find that the output is $y = \sum_{j=1}^{n} x_j w_j + \theta$. If we further assume that

  • $x_0 = 1$
  • $w_0 = \theta$

then the output further reduces to the dot product of $x$ and $w$: $y = \sum_{j=0}^{n} x_j w_j = x \cdot w$.
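
As a quick illustration of this definition, the following Python sketch (not part of the original article; the numeric values are arbitrary) computes the output both as the explicit sum plus the constant and as the dot product with $x_0 = 1$ and $w_0 = \theta$ folded in:

    import numpy as np

    x = np.array([0.5, -1.0, 2.0])   # input vector, n = 3
    w = np.array([0.1, 0.4, -0.2])   # weight vector
    theta = 0.3                      # the constant

    # y = sum_{j=1..n} x_j w_j + theta
    y_sum = np.dot(w, x) + theta

    # Fold the constant into the vectors: x_0 = 1, w_0 = theta, so y = x . w
    x_aug = np.concatenate(([1.0], x))
    w_aug = np.concatenate(([theta], w))
    y_dot = np.dot(w_aug, x_aug)

    assert np.isclose(y_sum, y_dot)  # both forms give the same output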

Learning algorithm

Let us assume:

  • $\eta$ is the learning rate (some positive constant)
  • $y$ is the output of the model
  • $o$ is the target (desired) output

then the weights are updated as follows: $w \leftarrow w + \eta (o - y) x$. The ADALINE converges to the least squares error, which is $E = (o - y)^{2}$.[6] This update rule is in fact the stochastic gradient descent update for linear regression.[7]
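
A minimal sketch of this update rule in Python, assuming a small synthetic regression problem (the data, the learning rate of 0.01 and the 20-epoch loop are illustrative choices, not from the article):

    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic data: targets are a noisy linear function of the inputs.
    X = rng.normal(size=(200, 3))
    w_true = np.array([1.5, -2.0, 0.5])
    targets = X @ w_true + 0.1 * rng.normal(size=200)

    eta = 0.01            # learning rate (some positive constant)
    w = np.zeros(3)       # weight vector (bias omitted for brevity)

    for epoch in range(20):
        for x, o in zip(X, targets):
            y = np.dot(w, x)              # output of the model (the net)
            w = w + eta * (o - y) * x     # LMS / stochastic gradient descent step

    print(w)  # should end up close to w_true; the squared error (o - y)^2 shrinks on average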

References

  1. ^ Talking Nets: An Oral History of Neural Networks.
  2. ^ YouTube: widrowlms: Science in Action
  3. ^ 1960: An adaptive "ADALINE" neuron using chemical "memistors"
  4. ^ YouTube: widrowlms: The LMS algorithm and ADALINE. Part I - The LMS algorithm
  5. ^ YouTube: widrowlms: The LMS algorithm and ADALINE. Part II - ADALINE and memistor ADALINE
  6. ^ "Adaline (Adaptive Linear)" (PDF). CS 4793: Introduction to Artificial Neural Networks. Department of Computer Science, University of Texas at San Antonio.
  7. ^ Avi Pfeffer. "CS181 Lecture 5 — Perceptrons" (PDF). Harvard University.