Rana el Kaliouby
Born: 1978 (age 40–41)
Education: University of Cambridge
Occupation: Computer scientist, entrepreneur
Title: CEO of Affectiva
Rana el Kaliouby (born 1978) is an Egyptian-American computer scientist and entrepreneur in the field of expression recognition research and technology development, which is a subset of facial recognition designed to identify the emotions expressed by the face. El Kaliouby's research moved beyond the field's dependence on exaggerated or caricatured expressions modeled by laboratory actors, to focus on the subtle glances found in real situations. She is the co-founder, with Rosalind Picard, and CEO of Affectiva.
El Kaliouby worked as a research scientist at the Massachusetts Institute of Technology, where she helped found its Autism & Communication Technology Initiative. Her original goal was to improve human-computer interaction, but she quickly became fascinated with the possibility of applying the technology to improve human-human communication, especially for autistic people, many of whom struggle with emotional communication. At the MIT Media Lab's Affective Computing group, she was part of a team that pioneered the "emotional hearing aid", a set of emotion-reading wearable glasses that The New York Times included in its Top 100 innovations of 2006. El Kaliouby demonstrates her work, and is interviewed, in Do You Trust This Computer?, a 2018 documentary on artificial intelligence.
El Kaliouby has stated that computers, while good with information, fall short at reading feelings, and so must be prompted manually to respond to an operator's needs. Her work focuses on the subtle facial changes people make: she has identified 24 landmarks on the face, each of which moves differently depending on the emotion being expressed. The approach has many applications, from linguistics to video production. Autistic people, whose expressions often differ from the norm, could have their moods monitored more easily by parents or caretakers. In video production, computer-generated imagery of faces (and, presumably, android projects) could render subtle expressions more realistically.
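The idea of mapping landmark movements to expressions can be illustrated with a toy sketch. The landmark names, thresholds, and rules below are hypothetical placeholders for illustration only, not el Kaliouby's actual 24-landmark model, which is far more nuanced.

```python
# Toy rule-based sketch of landmark-driven expression reading.
# Landmark names, thresholds, and emotion rules are hypothetical,
# chosen only to illustrate the general technique.

def classify_expression(displacements):
    """Map facial-landmark displacements (arbitrary units,
    positive = upward/outward motion) to a coarse label."""
    lip_corner = displacements.get("lip_corner", 0.0)
    inner_brow = displacements.get("inner_brow", 0.0)
    if lip_corner > 0.5:          # raised lip corners suggest a smile
        return "smile"
    if inner_brow < -0.5:         # lowered inner brows suggest a frown
        return "furrowed brow"
    return "neutral"

print(classify_expression({"lip_corner": 0.8}))   # smile
print(classify_expression({"inner_brow": -0.9}))  # furrowed brow
print(classify_expression({}))                    # neutral
```

A real system would track many landmarks over time and use trained statistical models rather than fixed thresholds, but the core mapping from geometric change to emotion label is the same.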
El Kaliouby leads Affectiva's Emotion Science team. The company applies computer vision, machine learning and data science to its facial-emotion repository, which it says is the world's largest at 2 million faces, to understand people's feelings and behaviors.
Awards and recognition
El Kaliouby was inducted into the "Women in Engineering" Hall of Fame. She is also a member of ACM, IEEE, Association of Children's Museums, British Machine Vision Association, and Nahdet el Mahrousa. Other awards include:
- 7 Women to Watch in 2014 – Entrepreneur Magazine
- Mass High Tech Top 20 Women to Watch 2014
- The Wired Smart List – Wired 2013
- MIT TR35 2012
- Smithsonian Magazine American Ingenuity Award in Technology
- Forbes America's Top 50 Women In Tech 2018
External links
- Rana el Kaliouby at TED
- "This app knows how you feel — from the look on your face" (TEDWomen 2015)
- When algorithms grow accustomed to your face | Nov. 2013 – New York Times.
- 25 Most Audacious Companies | April 2013 – Inc.
- The New Face of AdTech Goes Consumer | Aug 2012 – TechCrunch.
- Does Your Phone Know How Happy You Are? | June 2012 – Fast Company.