Bernhard Schölkopf: Difference between revisions

From Wikipedia, the free encyclopedia
Revision as of 11:43, 6 August 2021

Bernhard Schölkopf

[Infobox image: Bernhard Schölkopf in 2018. Photo: David Ausserhofer]

Born: February 20, 1968
Awards:
  BBVA Foundation Frontiers of Knowledge Awards (2020)
  Koerber European Science Prize (2019)
  Causality in Statistics Education Award, American Statistical Association[1]
  Gottfried-Wilhelm-Leibniz-Preis (2018)
  Fellow of the ACM (Association for Computing Machinery) (2018)
  Member of the German National Academy of Science (Leopoldina) (2017)
  Royal Society Milner Award (2014)
  Academy Prize of the Berlin-Brandenburg Academy of Sciences and Humanities (2012)
  Max Planck Research Award (2011)
  J. K. Aggarwal Prize of the International Association for Pattern Recognition (2006)
Institutions: Max Planck Institute for Intelligent Systems

Bernhard Schölkopf (born February 20, 1968) is a German computer scientist known for his work in machine learning, especially on kernel methods and causality. He is a director at the Max Planck Institute for Intelligent Systems in Tübingen, Germany, where he heads the Department of Empirical Inference. He is also an affiliated professor at ETH Zürich, honorary professor at the University of Tübingen and the Technical University Berlin, and chairman of the European Laboratory for Learning and Intelligent Systems (ELLIS).

Research

Kernel methods

Schölkopf developed SVM methods that achieved record performance on the MNIST pattern recognition benchmark at the time.[2] With the introduction of kernel PCA, Schölkopf and coauthors argued that SVMs are a special case of a much larger class of methods: any algorithm that can be expressed in terms of dot products can be generalized to a nonlinear setting by means of reproducing kernels.[3][4] Another significant observation was that the data on which the kernel is defined need not be vectorial, as long as the kernel Gram matrix is positive definite.[5] Together, these insights led to the foundation of the field of kernel methods, which encompasses SVMs and many other algorithms. Kernel methods are now textbook knowledge and one of the major machine learning paradigms in research and applications.

Building on kernel PCA, Schölkopf extended it to extract invariant features and to design invariant kernels, and showed how to view other major dimensionality reduction methods, such as LLE and Isomap, as special cases. In further work with Alex Smola and others, he extended the SVM method to regression and classification with pre-specified sparsity and quantile/support estimation. He proved a representer theorem implying that SVMs, kernel PCA, and most other kernel algorithms, regularized by a norm in a reproducing kernel Hilbert space, have solutions taking the form of kernel expansions over the training data, thus reducing an infinite-dimensional optimization problem to a finite-dimensional one. He co-developed kernel mean embeddings, which represent probability distributions as elements of a Hilbert space, with links to Fraunhofer diffraction as well as applications to independence testing.
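
In symbols, the representer theorem referred to above can be stated as follows (a standard textbook formulation, not a quotation from the original paper; here H is the reproducing kernel Hilbert space with kernel k, L is an arbitrary loss, and Omega is strictly increasing):

```latex
% Minimizing a regularized empirical risk over the whole RKHS ...
\min_{f \in \mathcal{H}} \; \sum_{i=1}^{n} L\bigl(y_i, f(x_i)\bigr)
    \;+\; \Omega\bigl(\lVert f \rVert_{\mathcal{H}}\bigr)
% ... always admits a minimizer that is a finite kernel expansion
% on the n training points:
\qquad
f^{\ast}(\cdot) \;=\; \sum_{i=1}^{n} \alpha_i \, k(x_i, \cdot),
\qquad \alpha_i \in \mathbb{R}.
```

The optimization thus reduces to finding the n real coefficients alpha_i, which is what makes SVMs and kernel PCA computationally tractable.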

Causality

Starting in 2005, Schölkopf turned his attention to causal inference. Causal mechanisms in the world give rise to statistical dependencies as epiphenomena, but only the latter are exploited by popular machine learning algorithms. Knowledge of causal structures and mechanisms is useful because it lets us predict not only future data coming from the same source but also the effects of interventions in a system, and because it facilitates the transfer of detected regularities to new situations.
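
The gap between statistical dependence and causal structure can be made concrete with a toy simulation (a hypothetical example, not from the article): a hidden common cause Z induces a strong correlation between X and Y, yet intervening on X has no effect on Y, which purely statistical learning cannot anticipate.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Structural causal model: Z causes both X and Y; X does NOT cause Y.
z = rng.normal(size=n)
x_obs = z + 0.5 * rng.normal(size=n)
y_obs = z + 0.5 * rng.normal(size=n)

# Observationally, X and Y are strongly dependent ...
r_obs = np.corrcoef(x_obs, y_obs)[0, 1]

# ... but under an intervention do(X := x) -- replacing X's mechanism
# while leaving the rest of the model intact -- Y is unaffected.
x_int = rng.normal(size=n)              # X set independently of Z
y_int = z + 0.5 * rng.normal(size=n)    # Y's mechanism is unchanged
r_int = np.corrcoef(x_int, y_int)[0, 1]
```

A predictor trained on the observational data would use X to predict Y and fail as soon as X is intervened upon; knowing the causal graph avoids the mistake.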

Schölkopf and co-workers addressed (and in certain settings solved) the problem of causal discovery in the two-variable case, and connected causality to Kolmogorov complexity.[6]
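
One family of two-variable methods can be sketched as follows. This is the additive-noise heuristic, a simpler relative of the algorithmic-information framework in the cited paper, not that paper's method: fit a nonparametric regression in both directions and prefer the direction whose residuals are independent of the input. Kernel ridge regression and a biased HSIC estimate stand in for more careful machinery, and all parameters are illustrative.

```python
import numpy as np

def rbf(a, b, gamma):
    return np.exp(-gamma * (a[:, None] - b[None, :]) ** 2)

def residuals(x, y, gamma=0.5, lam=1e-2):
    """Kernel ridge regression of y on x; returns y minus the fitted values."""
    K = rbf(x, x, gamma)
    alpha = np.linalg.solve(K + lam * np.eye(len(x)), y)
    return y - K @ alpha

def hsic(a, b, gamma=0.5):
    """Biased empirical HSIC: close to 0 iff the samples are (near) independent."""
    n = len(a)
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(rbf(a, a, gamma) @ H @ rbf(b, b, gamma) @ H) / n**2

rng = np.random.default_rng(0)
x = rng.uniform(-2, 2, 300)
y = x**3 + rng.uniform(-1, 1, 300)          # ground truth: X -> Y
x, y = (x - x.mean()) / x.std(), (y - y.mean()) / y.std()

forward = hsic(x, residuals(x, y))           # residual independent of cause: small
backward = hsic(y, residuals(y, x))          # misspecified direction: large
direction = "X->Y" if forward < backward else "Y->X"
```

The asymmetry exploited here is that an additive-noise model fits in the causal direction but generally not in the reverse one, so the residual-independence score breaks the tie that symmetric correlation cannot.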

Around 2010, Schölkopf began to explore how to use causality for machine learning, exploiting assumptions of independence of mechanisms and invariance.[7] His early work on causal learning was introduced to a wider machine learning audience in his Posner lecture[8] at NeurIPS 2011, as well as in a keynote talk at ICML 2017.[9] He studied how to exploit underlying causal structures to make machine learning methods more robust with respect to distribution shifts[10] and systematic errors,[11] the latter leading to the discovery of a number of new exoplanets, including K2-18b, which was subsequently found to contain water vapour in its atmosphere, a first for an exoplanet in the habitable zone.
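
The idea behind the half-sibling regression of the cited PNAS paper[11] can be sketched as follows: light curves of many stars recorded by the same telescope share instrument systematics but not each other's astrophysical signals, so the systematic part of a target star's light curve can be predicted from the other stars and subtracted. The toy data generator below (random-walk drift, transit depth, star count) is entirely invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
T, k = 500, 20                                  # time steps, "half-sibling" stars

systematic = np.cumsum(rng.normal(size=T))      # shared instrument drift
siblings = systematic[:, None] * rng.uniform(0.5, 1.5, k) \
           + 0.1 * rng.normal(size=(T, k))      # other stars: systematics + own noise
transit = -0.5 * (np.arange(T) % 100 < 5)       # toy transit dips in the target star
target = systematic + transit + 0.1 * rng.normal(size=T)

# Half-sibling regression: predict the target from the siblings (which share
# only the systematics, not the astrophysical signal) and subtract the fit.
A = np.c_[siblings, np.ones(T)]                 # include an intercept
coef, *_ = np.linalg.lstsq(A, target, rcond=None)
corrected = target - A @ coef                   # drift removed, transits remain
```

Because the siblings are causally unrelated to the target star's own signal, whatever they can predict must be confounding, so subtracting the prediction removes the systematic errors while leaving the transits intact.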

Education and employment

Schölkopf studied mathematics, physics, and philosophy in Tübingen and London. He was supported by the Studienstiftung and won the Lionel Cooper Memorial Prize for the best M.Sc. in Mathematics at the University of London.[12] He completed a Diplom in physics and then moved to Bell Labs in New Jersey, where he worked with Vladimir Vapnik, who became co-adviser of his PhD thesis at TU Berlin (with Stefan Jähnichen). His thesis, defended in 1997, won the annual award of the German Informatics Association.[13] In 2001, following positions in Berlin, Cambridge, and New York, he founded the Department of Empirical Inference at the Max Planck Institute for Biological Cybernetics, which grew into a leading center for machine learning research. In 2011, he became a founding director of the Max Planck Institute for Intelligent Systems.[14][15]

With Alex Smola, Schölkopf co-founded the series of Machine Learning Summer Schools.[16] He also co-founded a Cambridge–Tübingen PhD Programme[17] and the Max Planck–ETH Center for Learning Systems.[18] In 2016, he co-founded the Cyber Valley research consortium.[19] He participated in the IEEE Global Initiative on "Ethically Aligned Design".[20]

Schölkopf is co-editor-in-chief of the Journal of Machine Learning Research, a journal he helped found as part of a mass resignation of the editorial board of the journal Machine Learning. He is among the world's most cited computer scientists.[21] Alumni of his lab include Ulrike von Luxburg, Carl Rasmussen, Matthias Hein, Arthur Gretton, Gunnar Rätsch, Matthias Bethge, Stefanie Jegelka, Jason Weston, Olivier Bousquet, Olivier Chapelle, Joaquin Quinonero-Candela, and Sebastian Nowozin.[22]

Awards

Schölkopf’s awards include the Royal Society Milner Award and, shared with Isabelle Guyon and Vladimir Vapnik, the BBVA Foundation Frontiers of Knowledge Award in the Information and Communication Technologies category (a first for a scientist working in Europe).[23]

References

  1. ^ "Causality in Statistics Education Award". www.amstat.org.
  2. ^ Decoste, Dennis; Schölkopf, Bernhard (January 1, 2002). "Training Invariant Support Vector Machines". Machine Learning. 46 (1): 161–190. doi:10.1023/A:1012454411458 – via Springer Link.
  3. ^ Schölkopf, Bernhard; Smola, Alexander; Müller, Klaus-Robert (1998). "Nonlinear Component Analysis as a Kernel Eigenvalue Problem". Neural Computation. 10 (5): 1299–1319.
  4. ^ Burges, Christopher J.C. (June 1, 1998). "A Tutorial on Support Vector Machines for Pattern Recognition". Data Mining and Knowledge Discovery. 2 (2): 121–167. doi:10.1023/A:1009715923555 – via Springer Link.
  5. ^ Williams, Jon. "Support vector learning | Empirical Inference". Max Planck Institute for Intelligent Systems.
  6. ^ Janzing, Dominik; Schölkopf, Bernhard (October 6, 2010). "Causal Inference Using the Algorithmic Markov Condition". IEEE Transactions on Information Theory. 56 (10): 5168–5194. doi:10.1109/TIT.2010.2060095 – via IEEE Xplore.
  7. ^ Schölkopf, Bernhard; Janzing, Dominik; Peters, Jonas; Sgouritsa, Eleni; Zhang, Kun; Mooij, Joris (2012). "On Causal and Anticausal Learning". Proceedings of the 29th International Conference on Machine Learning (ICML 2012).
  8. ^ "From kernels to causal inference". videolectures.net.
  9. ^ "Causal Learning --- Bernhard Schölkopf". October 15, 2017 – via Vimeo.
  10. ^ Schölkopf, Bernhard (February 6, 2015). "Learning to see and act". Nature. 518 (7540): 486–487. doi:10.1038/518486a – via www.nature.com.
  11. ^ Schölkopf, Bernhard; Hogg, David W.; Wang, Dun; Foreman-Mackey, Daniel; Janzing, Dominik; Simon-Gabriel, Carl-Johann; Peters, Jonas (July 5, 2016). "Modeling confounding by half-sibling regression". Proceedings of the National Academy of Sciences. 113 (27): 7391–7398.
  12. ^ "Curriculum Vitae: Bernhard Schölkopf" (PDF). German National Academy of Sciences Leopoldina.
  13. ^ "TU Berlin - Medieninformation Nr. 209 - 17. September 1998". archiv.pressestelle.tu-berlin.de.
  14. ^ "History of the Institute". www.kyb.tuebingen.mpg.de.
  15. ^ MaxPlanckResearch 2/2011 (PDF). Max Planck Society.
  16. ^ "Machine Learning Summer Schools - MLSS". mlss.cc.
  17. ^ "Cambridge Machine Learning Group | PhD Programme in Advanced Machine Learning".
  18. ^ "Max Planck ETH Center for Learning Systems". learning-systems.org.
  19. ^ "Service". Baden-Württemberg.de.
  20. ^ "Ethically Aligned Design, Version 1" (PDF). IEEE Standards Association.
  21. ^ "World's Top Computer Scientists: H-Index Computer Science Ranking". www.guide2research.com.
  22. ^ "Alumni". people.tuebingen.mpg.de.
  23. ^ Williams, Jon. "Bernhard Schölkopf receives Frontiers of Knowledge Award | Empirical Inference". Max Planck Institute for Intelligent Systems.