Paul Werbos

From Wikipedia, the free encyclopedia

Paul J. Werbos (born 1947) is a scientist best known for his 1974 Harvard University Ph.D. thesis, which first described the process of training artificial neural networks through backpropagation of errors.[1] The thesis, along with some supplementary information, can be found in his book, The Roots of Backpropagation (ISBN 0-471-59897-6). He was also a pioneer of recurrent neural networks.[2]
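Backpropagation computes the gradient of a network's error by applying the chain rule backwards through its layers, so that every weight can be adjusted by gradient descent. The following minimal sketch is an illustrative modern rendering of that idea (not Werbos's original 1974 formulation): a tiny two-layer sigmoid network is trained on the XOR problem, with the backward pass written out explicitly.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

random.seed(0)

# A 2-2-1 network. W1: two hidden neurons, each with 2 input weights + bias.
# W2: output neuron with 2 hidden weights + bias.
W1 = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(2)]
W2 = [random.uniform(-1, 1) for _ in range(3)]

# XOR training data: (inputs, target).
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]

def forward(x):
    """Forward pass: returns hidden activations and the network output."""
    h = [sigmoid(W1[j][0] * x[0] + W1[j][1] * x[1] + W1[j][2]) for j in range(2)]
    y = sigmoid(W2[0] * h[0] + W2[1] * h[1] + W2[2])
    return h, y

def total_loss():
    return sum((forward(x)[1] - t) ** 2 for x, t in data)

loss_before = total_loss()
lr = 0.5
for _ in range(5000):
    for x, t in data:
        h, y = forward(x)
        # Backward pass: propagate the error derivative layer by layer.
        # dL/d(output pre-activation) = 2(y - t) * sigmoid'(·) = 2(y - t) * y(1 - y)
        dy = 2 * (y - t) * y * (1 - y)
        # Chain rule through the output weights and each hidden sigmoid.
        dh = [dy * W2[j] * h[j] * (1 - h[j]) for j in range(2)]
        # Gradient-descent updates for output layer, then hidden layer.
        for j in range(2):
            W2[j] -= lr * dy * h[j]
        W2[2] -= lr * dy
        for j in range(2):
            W1[j][0] -= lr * dh[j] * x[0]
            W1[j][1] -= lr * dh[j] * x[1]
            W1[j][2] -= lr * dh[j]

loss_after = total_loss()
print(loss_before, loss_after)
```

After training, the squared error over the four XOR patterns is far smaller than at initialization; the key point is that the gradient for every weight, including those in the hidden layer, falls out of one backward sweep of the chain rule.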

Werbos was one of the first three two-year presidents of the International Neural Network Society (INNS). He was awarded the IEEE Neural Network Pioneer Award for the discovery of backpropagation and for other basic neural network learning frameworks such as adaptive dynamic programming.

Werbos has also written on quantum mechanics and other areas of physics.[3][4] He is also interested in larger questions relating to consciousness, the foundations of physics, and human potential. Roger Penrose discusses some of these ideas in his book Shadows of the Mind.

He served as a program director at the National Science Foundation for several years, until 2015.

References

  1. ^ Werbos, P. (1974). Beyond Regression: New Tools for Prediction and Analysis in the Behavioral Sciences. PhD thesis, Harvard University.
  2. ^ Werbos, P. (1990). "Backpropagation through time: what it does and how to do it". Proceedings of the IEEE, 78(10), 1550–1560. doi:10.1109/5.58337
  3. ^ Werbos, P. "A Conjecture About Fermi-Bose Equivalence". http://arxiv.org/pdf/hep-th/0505023.pdf
  4. ^ http://stardrive.org/stardrive/index.php/all-blog-articles/8986-
