Stuart Geman

Born: Chicago, Illinois
Nationality: American
Fields: Mathematics
Institutions: Brown University
Alma mater: University of Michigan (B.S., 1971); Dartmouth College (M.S., 1973); Massachusetts Institute of Technology (Ph.D., 1977)
Doctoral advisors: Herman Chernoff, Frank Kozin

Stuart Geman is an American mathematician, known for influential contributions to computer vision, statistics, probability theory, machine learning, and the neurosciences.[1][2][3][4]

Biography

Geman was born and raised in Chicago. He was educated at the University of Michigan (B.S., Physics, 1971), Dartmouth Medical College (M.S., Neurophysiology, 1973), and the Massachusetts Institute of Technology (Ph.D., Applied Mathematics, 1977).

Since 1977, he has been a member of the faculty at Brown University, where he has worked in the Pattern Theory group, and is currently the James Manning Professor of Applied Mathematics. He has received many honors and awards, including selection as a Presidential Young Investigator and as an ISI Highly Cited Researcher. He is an elected member of the International Statistical Institute, and a fellow of the Institute of Mathematical Statistics and of the American Mathematical Society.[5] He was elected to the US National Academy of Sciences in 2011.

Work

Geman’s scientific contributions span probabilistic and statistical approaches to artificial intelligence, Markov random fields, Markov chain Monte Carlo (MCMC) methods, nonparametric inference, random matrices, random dynamical systems, neural networks, neurophysiology, financial markets, and natural image statistics. Particularly notable works include the development of the Gibbs sampler, the proof of convergence of simulated annealing,[6][7] foundational contributions to the Markov random field (“graphical model”) approach to inference in vision and machine learning,[3][8] and work on the compositional foundations of vision and cognition.[9][10]
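The core idea of the Gibbs sampler is that a multivariate distribution can be simulated by repeatedly drawing each variable from its conditional distribution given the current values of the others. The following is a minimal illustrative sketch, not Geman's original image-restoration setting: it samples a zero-mean bivariate normal with correlation rho, whose full conditionals are themselves one-dimensional normals.

```python
import math
import random

def gibbs_bivariate_normal(rho, n_samples, burn_in=1000, seed=0):
    """Gibbs sampler for a zero-mean, unit-variance bivariate normal.

    The full conditionals are:
        x | y ~ N(rho * y, 1 - rho**2)
        y | x ~ N(rho * x, 1 - rho**2)
    Alternately drawing from these converges to the joint distribution.
    """
    rng = random.Random(seed)
    sd = math.sqrt(1.0 - rho ** 2)   # conditional standard deviation
    x, y = 0.0, 0.0                  # arbitrary starting point
    samples = []
    for i in range(n_samples + burn_in):
        x = rng.gauss(rho * y, sd)   # draw x from p(x | y)
        y = rng.gauss(rho * x, sd)   # draw y from p(y | x)
        if i >= burn_in:             # discard the burn-in portion of the chain
            samples.append((x, y))
    return samples

samples = gibbs_bivariate_normal(rho=0.8, n_samples=20000)
n = len(samples)
mean_x = sum(x for x, _ in samples) / n
corr = sum(x * y for x, y in samples) / n  # zero means, unit variances assumed
```

With enough retained draws, the empirical mean of each coordinate should be near 0 and the empirical correlation near the chosen rho, even though no joint draw from the bivariate normal is ever made directly.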

Notes

  1. Thomas P. Ryan and William H. Woodall (2005). "The Most-Cited Statistical Papers". Journal of Applied Statistics 32 (5): 461–474. doi:10.1080/02664760500079373.
  2. S. Kotz and N. L. Johnson (1997). Breakthroughs in Statistics, Volume III. New York, NY: Springer-Verlag.
  3. List of important publications in computer science, Wikipedia.
  4. Sharon Bertsch McGrayne (2011). The Theory That Would Not Die. New Haven and London: Yale University Press.
  5. List of Fellows of the American Mathematical Society, retrieved 2013-08-27.
  6. P. J. van Laarhoven and E. H. Aarts (1987). Simulated Annealing: Theory and Applications. Netherlands: Kluwer.
  7. P. Salamon, P. Sibani, and R. Frost (2002). Facts, Conjectures, and Improvements for Simulated Annealing. Philadelphia, PA: Society for Industrial and Applied Mathematics.
  8. C. Bishop (2006). Pattern Recognition and Machine Learning. New York: Springer.
  9. N. Chater, J. B. Tenenbaum, and A. Yuille (2006). "Probabilistic models of cognition: Conceptual foundations". Trends in Cognitive Sciences 10 (7): 287–291. doi:10.1016/j.tics.2006.05.007.
  10. B. Ommer and J. M. Buhmann (2010). "Learning the compositional structure of visual object categories for recognition". IEEE Transactions on Pattern Analysis and Machine Intelligence 32 (3): 501–516. doi:10.1109/TPAMI.2009.22.