Rana el Kaliouby
| Rana el Kaliouby | |
| --- | --- |
| Born | 1978 (age 45–46) |
| Education | American University in Cairo (BS, MS); Newnham College, Cambridge (PhD) |
| Title | CEO at Affectiva |
| Children | 2 |
| Website | affectiva |
Rana el Kaliouby (Arabic: رنا القليوبي; born 1978) is an Egyptian-American computer scientist and entrepreneur in the field of expression recognition research and technology development, which is a subset of facial recognition designed to identify the emotions expressed by the face.[1] El Kaliouby's research moved beyond the field's dependence on exaggerated or caricatured expressions modeled by laboratory actors, to focus on the subtle glances found in real situations.[citation needed] She is the co-founder, with Rosalind Picard, and CEO of Affectiva.
El Kaliouby is a pioneer in artificial intelligence and the co-founder and CEO of Affectiva, an AI startup spun off from the MIT Media Lab. After growing up in Cairo, Egypt, she earned a PhD at the University of Cambridge, then joined the MIT Media Lab as a research scientist, where she spearheaded the application of emotion-recognition technology in a variety of fields, including mental health and autism. She left MIT to co-found Affectiva, a company credited with defining the field of Emotion AI, which now works with 25% of the Fortune 500.[2] Forbes named her to its list of America's Top 50 Women in Tech,[3] Fortune included her in its 40 Under 40 list,[4] and the World Economic Forum chose her as a Young Global Leader[5] and a member of its Future Global Council on Robotics and Artificial Intelligence. She speaks regularly on ethics in AI and fighting bias in AI at conferences from the Aspen Ideas Festival to the Wall Street Journal's Future of Everything. She hosted a PBS Nova program.[6]
El Kaliouby is on a mission to humanize technology with artificial emotional intelligence, or what she calls "Emotion AI", by developing a deep-learning platform that combines facial expression with tone of voice to infer how a person is feeling. She is the author of Girl Decoded: A Scientist's Quest to Reclaim Our Humanity by Bringing Emotional Intelligence to Technology.[7]
Education
El Kaliouby earned Bachelor of Science and Master of Science degrees from the American University in Cairo, then a PhD at Newnham College, Cambridge.[8]
Career
El Kaliouby worked as a research scientist at the Massachusetts Institute of Technology, helping to found its Autism & Communication Technology Initiative.[9] Her original goal was to improve human-computer interaction, but she quickly became fascinated by the possibility of applying this technology to improve human-to-human communication, especially for autistic people, many of whom struggle with emotional communication.[10] In the Affective Computing group at the MIT Media Lab, she was part of a team that pioneered development of the "emotional hearing aid",[11] a set of emotion-reading wearable glasses that The New York Times included in its Top 100 innovations of 2006.[12] El Kaliouby demonstrates her work and is interviewed in the 2018 documentary on artificial intelligence Do You Trust This Computer?
El Kaliouby has stated that computers, while good with information, fall short when it comes to determining feelings, and so require manual prompting to respond to an operator's needs. Her work focuses on the subtle facial changes that people tend to make. She has identified 24 landmarks on the face, each moving in different ways depending on the emotion expressed.[13] This has many applications, from linguistics to video production. Autistic people, whose expressions often differ from the norm, may be able to have their moods more easily monitored by parents or caretakers. For production purposes, computer-generated imagery of faces (and presumably android projects) could render subtle expressions more realistically.[original research?][citation needed]
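The general idea of inferring an expression from the movement of facial landmarks can be illustrated with a minimal sketch. The landmark names and the thresholded smile/frown heuristic below are assumptions chosen for illustration; they are not El Kaliouby's or Affectiva's actual model, which is based on machine learning over millions of face videos.

```python
# Minimal illustrative sketch, not Affectiva's pipeline: classify an
# expression from 2D landmark coordinates. Landmark names and thresholds
# here are hypothetical.

def classify_expression(landmarks):
    """Label an expression from a dict of (x, y) landmark positions.

    A smile tends to raise the mouth corners above the mouth centre;
    a frown lowers them. Image y-coordinates grow downward, so a raised
    corner has a *smaller* y value than the centre.
    """
    left = landmarks["mouth_corner_left"]
    right = landmarks["mouth_corner_right"]
    centre = landmarks["mouth_centre"]
    # Average vertical lift of the corners relative to the mouth centre.
    lift = centre[1] - (left[1] + right[1]) / 2
    if lift > 2:       # corners noticeably higher than the centre
        return "smile"
    if lift < -2:      # corners noticeably lower than the centre
        return "frown"
    return "neutral"

smiling = {"mouth_corner_left": (30, 60), "mouth_corner_right": (70, 60),
           "mouth_centre": (50, 68)}
print(classify_expression(smiling))  # smile
```

A real system tracks many such landmarks over time and feeds their motion into a trained classifier rather than hand-written thresholds, but the mapping from landmark geometry to an emotion label is the same in spirit.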
El Kaliouby leads Affectiva's Emotion Science team.[14] The company applies computer vision, machine learning, and data science to its facial-emotion repository, which has grown to nearly 6 million faces analyzed in 75 countries (5,313,751 face videos totaling 38,944 hours of data, or nearly 2 billion facial frames),[15] in order to understand people's feelings and behaviors.[16]
In 2016, she became the CEO of Affectiva.[2]
In November 2019, Affectiva was covered as a case study at Harvard Business School on the Emotion AI category with Professor Shane Greenstein.[17]
Awards and recognition
El Kaliouby was inducted into the "Women in Engineering" Hall of Fame.[14][18] She is also a member of ACM, IEEE, Association of Children's Museums, British Machine Vision Association, and Nahdet el Mahrousa.[19] Other awards include:
- 7 Women to Watch in 2014 – Entrepreneur Magazine[20]
- Mass High Tech Top 20 Women to Watch 2014[21]
- The Wired Smart List – Wired 2013[22]
- MIT TR35 2012[23]
- Smithsonian Magazine American Ingenuity Award in Technology[24]
- Forbes America's Top 50 Women In Tech 2018[25]
- BBC 100 Women in 2019[26]
Books
El Kaliouby's memoir Girl Decoded is due to be published in April 2020.[7]
El Kaliouby also contributed one chapter to the 2018 book Architects of Intelligence: The Truth About AI from the People Building It by the American futurist Martin Ford.[27]
References
- ^ "MIT Technology Review 2012". Retrieved 30 July 2014.
- ^ a b Affectiva. "Affectiva Co-Founder and CEO, Rana el Kaliouby". go.affectiva.com. Retrieved 2020-01-06.
- ^ Communications, Forbes Corporate. "Forbes Releases 2018 US List of Top 50 Women in Tech". Forbes. Retrieved 2020-01-06.
- ^ "Rana el Kaliouby". Fortune. Retrieved 2020-01-06.
- ^ "Meet 10 of the Young Global Leaders creating a sustainable future". World Economic Forum. Retrieved 2020-01-06.
- ^ "Profile: Rana el Kaliouby". www.pbs.org. Retrieved 2020-01-06.
- ^ a b El Kaliouby, Rana; Colman, Carol (2020). Girl Decoded: A Scientist's Quest to Reclaim our Humanity by Bringing Emotional Intelligence to Technology. Penguin Random House. ISBN 9781984824769.
- ^ El-Kaliouby, Rana (2005). Mind-reading machines: automated inference of complex mental states (Technical report). University of Cambridge, Computer Laboratory. UCAM-CL-TR-636.
- ^ "Archived copy". Archived from the original on 2014-11-29. Retrieved 2014-11-14.
- ^ "Autism Spectrum Disorder – Struggling with Communication".
- ^ El-Kaliouby, Rana; Robinson, Peter (2005-12-01). "The emotional hearing aid: an assistive tool for children with Asperger syndrome". Universal Access in the Information Society. 4 (2): 121–134. doi:10.1007/s10209-005-0119-0.
- ^ The Social-Cue Reader, New York Times Magazine, December 12, 2006.
- ^ Karen Weintraub, "Teaching devices to tell a frown from a smile", Innovators Under 35, Date not specified.
- ^ a b "Affectiva Company Team". Retrieved 9 November 2016.
- ^ "The World's Largest Emotion Database: 5.3 Million Faces and Counting".
- ^ "Affectiva Releases Findings from Largest Cross-Cultural Study on Gender Differences in Facial Expressions". www.businesswire.com. 2017-04-19. Retrieved 2020-01-06.
- ^ "Rana el Kaliouby". Harvard Business School Digital Initiative. Retrieved 2020-01-06.
- ^ The Women in Engineering Hall of Fame
- ^ "LinkedIn of Rana el Kaliouby".
- ^ "The 7 Most Powerful Women to Watch in 2014". 2014-01-03.
- ^ http://www.bizjournals.com/boston/event/100331
- ^ "The Wired Smart List 2013". Wired UK. 2013-12-09.
- ^ "Innovators Under 35 2012".
- ^ "2015 American Ingenuity Award Winners". Smithsonian Magazine. Smithsonian. Retrieved 12 October 2018.
- ^ Helen A. S. Popkin; Lauren Aratani; Samar Marwan, eds. (29 November 2018). "Rana el Kaliouby". Forbes. Retrieved 3 December 2018.
- ^ "BBC 100 Women 2019: Who is on the list this year?". BBC News. 16 October 2019.
- ^ Falcon, William (November 30, 2018). "This Is The Future Of AI According To 23 World-Leading AI Experts". Forbes. Retrieved March 20, 2019.
External links
- Rana el Kaliouby at TED
- "This app knows how you feel — from the look on your face" (TEDWomen 2015)
- When algorithms grow accustomed to your face | Nov. 2013 – New York Times.[1]
- 25 Most Audacious Companies | April 2013 – Inc.[2]
- The New Face of AdTech Goes Consumer | Aug 2012 – TechCrunch.[3]
- Does Your Phone Know How Happy You Are? | June 2012 – Fast Company.[4]
- Khatchadourian, Raffi (January 19, 2015). "We know how you feel: Computers are learning to read emotion, and the business world can't wait". The New Yorker.
- Rivera, Michael (September 26, 2018). "Profile: Rana el Kaliouby". PBS Nova.
- el Kaliouby, Rana. "Planning to Use A.I. in 2020? You Need to Make These Resolutions First". Inc. This Morning.
- ^ Eisenberg, Anne (2013-11-30). "When Algorithms Grow Accustomed to Your Face". The New York Times.
- ^ http://www.inc.com/audacious-companies/april-joyner/affectiva.html
- ^ "The New Face of AdTech Goes Consumer: Emotion Tracker Affectiva Gets $12M from KPCB, Horizon".
- ^ "Does Your Phone Know How Happy You Are? The Emotion-Recognition Industry Comes Giddily of Age". 2012-06-07.