Michael I. Jordan

From Wikipedia, the free encyclopedia
Michael I. Jordan
Born February 25, 1956 (age 62)
Residence Berkeley, California
Alma mater University of California, San Diego
Known for Latent Dirichlet allocation
Awards Fellow of the U.S. National Academy of Sciences[1]
AAAI Fellow (2002)
Rumelhart Prize (2015)[2]
IJCAI Award for Research Excellence (2016)
Scientific career
Institutions University of California, Berkeley
University of California, San Diego
Massachusetts Institute of Technology
Thesis The Learning of Representations for Sequential Performance (1985)
Doctoral advisor David Rumelhart
Donald Norman
Website www.cs.berkeley.edu/~jordan

Michael Irwin Jordan is an American scientist, a professor at the University of California, Berkeley, and a researcher in machine learning, statistics, and artificial intelligence.[3][4][5] He is one of the leading figures in machine learning, and in 2016 Science reported him as the world's most influential computer scientist.[6][7][8][9][10][11]

Biography

Jordan received his BS magna cum laude in psychology in 1978 from Louisiana State University, his MS in mathematics in 1980 from Arizona State University, and his PhD in cognitive science in 1985 from the University of California, San Diego.[12] At the University of California, San Diego, Jordan was a student of David Rumelhart and a member of the PDP Group in the 1980s.

Jordan is currently a full professor at the University of California, Berkeley, where his appointment is split across the Department of Statistics and the Department of Electrical Engineering and Computer Sciences. He was a professor in the Department of Brain and Cognitive Sciences at MIT from 1988 to 1998.[12]

Work

In the 1980s, Jordan began developing recurrent neural networks as a cognitive model. In recent years his work has been driven less by a cognitive perspective and more by the framework of traditional statistics.

Jordan popularised Bayesian networks in the machine learning community and is known for pointing out links between machine learning and statistics. He was also prominent in the formalisation of variational methods for approximate inference[1] and the popularisation of the expectation-maximization algorithm[13] in machine learning.

Resignation from Machine Learning

In 2001, Jordan and others resigned from the editorial board of the journal Machine Learning. In a public letter, they argued for less restrictive access and pledged support for a new open access journal, the Journal of Machine Learning Research, which was created by Leslie Kaelbling to support the evolution of the field of machine learning.[14]

Honors and awards

Jordan has received numerous awards, including a best student paper award[15] (with X. Nguyen and M. Wainwright) at the International Conference on Machine Learning (ICML 2004), a best paper award (with R. Jacobs) at the American Control Conference (ACC 1991), the ACM/AAAI Allen Newell Award, the IEEE Neural Networks Pioneer Award, and an NSF Presidential Young Investigator Award. In 2010 he was named a Fellow of the Association for Computing Machinery "for contributions to the theory and application of machine learning".[16]

Jordan is a member of the National Academy of Sciences, the National Academy of Engineering, and the American Academy of Arts and Sciences.

He has been named a Neyman Lecturer and a Medallion Lecturer by the Institute of Mathematical Statistics. He received the David E. Rumelhart Prize in 2015 and the ACM/AAAI Allen Newell Award in 2009.

In 2016, Jordan was identified as the "most influential computer scientist", based on an analysis of the published literature by the Semantic Scholar project.[17]

References

  1. ^ a b Zeliadt, N. (2013). "Profile of Michael I. Jordan". Proceedings of the National Academy of Sciences. 110 (4): 1141–1143. doi:10.1073/pnas.1222664110. PMC 3557047. PMID 23341554.
  2. ^ Bio highlights of Prof. MI Jordan
  3. ^ Jacobs, R. A.; Jordan, M. I.; Nowlan, S. J.; Hinton, G. E. (1991). "Adaptive Mixtures of Local Experts". Neural Computation. 3: 79. doi:10.1162/neco.1991.3.1.79. 
  4. ^ Blei, D. M.; Ng, A. Y.; Jordan, M. I. (2003). "Latent Dirichlet Allocation". Journal of Machine Learning Research. 3: 993–1022.
  5. ^ Michael I. Jordan, ed. Learning in Graphical Models. Proceedings of the NATO Advanced Study Institute, Ettore Maiorana Centre, Erice, Italy, September 27-October 7, 1996
  6. ^ "Who's the Michael Jordan of computer science? New tool ranks researchers' influence". Science | AAAS. 2016-04-19. Retrieved 2018-03-28. 
  7. ^ "Top 50 authors in computer science" (PDF). Science. 
  8. ^ "Who is the Michael Jordan of computer science?". Berkeley Engineering. 2016-11-01. Retrieved 2018-03-28.
  9. ^ Austria, IST. "IST Austria: Lecture by Michael I. Jordan available on IST Austria's YouTube channel". ist.ac.at. Retrieved 2018-03-28. 
  10. ^ "Who's the Michael Jordan of Computer Science? New Tool Ranks Researchers' Influence | Careers | Communications of the ACM". cacm.acm.org. Retrieved 2018-03-28. 
  11. ^ "Michael I. Jordan". awards.acm.org. Retrieved 2018-03-28. 
  12. ^ a b Curriculum vitae, Michael I. Jordan, Department of Electrical Engineering and Computer Sciences and Department of Statistics, University of California, Berkeley. Accessed September 3, 2013.
  13. ^ Jordan, M. I.; Jacobs, R. A. (1994). "Hierarchical mixtures of experts and the EM algorithm". Neural Computation. 6 (2): 181–214. Retrieved 2015-12-19.
  14. ^ Editorial Board of the Kluwer Journal, Machine Learning: Resignation Letter (2001)
  15. ^ "Long Nguyen's Publications". Stat.lsa.umich.edu. Retrieved 2012-05-21. 
  16. ^ "ACM Names 41 Fellows from World's Leading Institutions — Association for Computing Machinery". Acm.org. Archived from the original on 2012-04-28. Retrieved 2012-05-21. 
  17. ^ "Who's the Michael Jordan of computer science? New tool ranks researchers' influence". Science | AAAS. 2016-04-19. Retrieved 2016-12-09. 
