Alex Graves (computer scientist)

From Wikipedia, the free encyclopedia

Alex Graves
Education: University of Edinburgh (BSc), IDSIA (PhD)
Known for: Connectionist temporal classification, Neural Turing machine, Differentiable neural computer
Scientific career
Fields: Artificial intelligence, Recurrent neural networks, Handwriting recognition, Speech recognition[1]
Institutions: DeepMind, University of Toronto
Thesis: Supervised sequence labelling with recurrent neural networks (2008)

Alex Graves is a computer scientist and research scientist at DeepMind.[1]

Education

Graves earned his Bachelor of Science degree in Theoretical Physics from the University of Edinburgh[when?] and a PhD in artificial intelligence under Jürgen Schmidhuber at IDSIA.[2]

Career and research

After his PhD, Graves was a postdoctoral researcher working with Schmidhuber at the Technical University of Munich and with Geoffrey Hinton[3] at the University of Toronto.

At IDSIA, Graves trained long short-term memory (LSTM) neural networks with a novel method called connectionist temporal classification (CTC).[4] This method outperformed traditional speech recognition models in certain applications.[5] In 2009, his CTC-trained LSTM was the first recurrent neural network (RNN) to win pattern recognition contests, winning several competitions in connected handwriting recognition.[6][7] Google uses CTC-trained LSTM for speech recognition on smartphones.[8][9]

Graves is also the creator of the neural Turing machine[10] and the closely related differentiable neural computer.[11][12] In 2023, he published the paper "Bayesian Flow Networks".[13]

References

  1. Alex Graves publications indexed by Google Scholar.
  2. "Alex Graves". Canadian Institute for Advanced Research. Archived from the original on 1 May 2015.
  3. "Marginally Interesting: What is going on with DeepMind and Google?". Blog.mikiobraun.de. 28 January 2014. Retrieved 17 May 2016.
  4. Alex Graves, Santiago Fernandez, Faustino Gomez, and Jürgen Schmidhuber (2006). "Connectionist temporal classification: Labelling unsegmented sequence data with recurrent neural nets". Proceedings of ICML'06, pp. 369–376.
  5. Santiago Fernandez, Alex Graves, and Jürgen Schmidhuber (2007). "An application of recurrent neural networks to discriminative keyword spotting". Proceedings of ICANN (2), pp. 220–229. doi:10.1007/978-3-540-74695-9_23.
  6. Graves, Alex; Schmidhuber, Jürgen (2009). "Offline Handwriting Recognition with Multidimensional Recurrent Neural Networks". In Bengio, Yoshua; Schuurmans, Dale; Lafferty, John; Williams, Chris K. I.; Culotta, Aron (eds.), Advances in Neural Information Processing Systems 22 (NIPS'22), December 7–10, 2009, Vancouver, BC. Neural Information Processing Systems (NIPS) Foundation, pp. 545–552. doi:10.5555/2981780.2981848.
  7. A. Graves, M. Liwicki, S. Fernandez, R. Bertolami, H. Bunke, and J. Schmidhuber (2009). "A Novel Connectionist System for Improved Unconstrained Handwriting Recognition". IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 31, no. 5. doi:10.1109/TPAMI.2008.137.
  8. Beaufays, Françoise (11 August 2015). "The neural networks behind Google Voice transcription". Google Research Blog. http://googleresearch.blogspot.co.at/2015/08/the-neural-networks-behind-google-voice.html
  9. Sak, Haşim; Senior, Andrew; Rao, Kanishka; Beaufays, Françoise; Schalkwyk, Johan (24 September 2015). "Google voice search: faster and more accurate". Google Research Blog. http://googleresearch.blogspot.co.uk/2015/09/google-voice-search-faster-and-more.html
  10. "Google's Secretive DeepMind Startup Unveils a 'Neural Turing Machine'". Retrieved 17 May 2016.
  11. Graves, Alex; Wayne, Greg; Reynolds, Malcolm; Harley, Tim; Danihelka, Ivo; Grabska-Barwińska, Agnieszka; Colmenarejo, Sergio Gómez; Grefenstette, Edward; Ramalho, Tiago (12 October 2016). "Hybrid computing using a neural network with dynamic external memory". Nature. 538 (7626): 471–476. Bibcode:2016Natur.538..471G. doi:10.1038/nature20101. ISSN 1476-4687. PMID 27732574. S2CID 205251479.
  12. "Differentiable neural computers". DeepMind. Retrieved 19 October 2016.
  13. Graves, Alex; Srivastava, Rupesh Kumar; Atkinson, Timothy; Gomez, Faustino (14 August 2023). "Bayesian Flow Networks". arXiv:2308.07037. doi:10.48550/ARXIV.2308.07037.