Affective computing

From Wikipedia, the free encyclopedia

This is an old revision of this page, as edited by Earthtodorian (talk | contribs) at 03:12, 26 October 2010 (Detecting and recognizing emotional information). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.

Affective Computing is also the title of a textbook on the subject by Rosalind Picard.

Affective computing is the study and development of systems and devices that can recognize, interpret, process, and simulate human affects. It is an interdisciplinary field spanning computer science, psychology, and cognitive science.[1] While the origins of the field may be traced back to early philosophical enquiries into emotion,[2] the more modern branch of computer science originated with Rosalind Picard's 1995 paper[3] on affective computing.[4][5] A motivation for the research is the ability to simulate empathy: the machine should interpret the emotional state of humans and adapt its behaviour to them, giving an appropriate response to those emotions.

Areas of affective computing

Detecting and recognizing emotional information

Detecting emotional information begins with passive sensors which capture data about the user's physical state or behavior without interpreting the input. The data gathered is analogous to the cues humans use to perceive emotions in others. For example, a video camera might capture facial expressions, body posture and gestures, while a microphone might capture speech. Other sensors detect emotional cues by directly measuring physiological data, such as skin temperature and galvanic resistance.[6]

Recognizing emotional information requires the extraction of meaningful patterns from the gathered data. This is done using machine learning techniques that process different modalities, such as speech recognition, natural language processing, or facial expression detection, and produce either labels (e.g. 'confused') or coordinates in a valence–arousal space. The state of the art was recently reviewed in the literature.[7]
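The two output forms mentioned above can be sketched in a few lines of Python. This is a minimal illustration, not any published system: the prototype feature vectors, the emotion labels, and the linear valence–arousal mapping are all invented for demonstration.

```python
import math

# Hypothetical prototype feature vectors (pitch variability, speech rate),
# normalized to [0, 1]; labels and values are invented for illustration.
PROTOTYPES = {
    "confused": (0.8, 0.3),
    "pleased": (0.4, 0.7),
    "bored": (0.1, 0.2),
}

def classify(features):
    """Nearest-prototype labelling: return the label whose
    prototype vector is closest to the observed features."""
    return min(PROTOTYPES, key=lambda label: math.dist(features, PROTOTYPES[label]))

def to_valence_arousal(features):
    """Alternative output form: a toy linear map from the same
    features onto valence/arousal coordinates in [-1, 1]."""
    pitch_variability, speech_rate = features
    valence = 2 * speech_rate - 1
    arousal = 2 * pitch_variability - 1
    return valence, arousal

label = classify((0.75, 0.35))
coords = to_valence_arousal((0.75, 0.35))
```

A real recognizer would learn its decision boundaries from labelled data rather than use fixed prototypes, but the contrast between categorical and dimensional outputs is the same.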

Emotion in machines

Another area within affective computing is the design of computational devices proposed either to exhibit innate emotional capabilities or to convincingly simulate emotions. A more practical approach, based on current technological capabilities, is the simulation of emotions in conversational agents in order to enrich and facilitate interactivity between human and machine.[8] While human emotions are often associated with surges in hormones and other neuropeptides, emotions in machines might be associated with abstract states associated with progress (or lack of progress) in autonomous learning systems.[citation needed] In this view, affective emotional states correspond to time-derivatives (perturbations) in the learning curve of an arbitrary learning system.[citation needed]
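The learning-curve view described above can be made concrete with a toy sketch: label each training step by the sign of the finite-difference time-derivative of the loss. The threshold and the state names below are illustrative assumptions, not part of any cited proposal.

```python
def affect_from_learning_curve(losses, eps=1e-3):
    """Label each training step by the sign of the finite-difference
    derivative of the loss curve; state names are illustrative."""
    states = []
    for prev, curr in zip(losses, losses[1:]):
        delta = curr - prev
        if delta < -eps:
            states.append("progress")   # loss falling: positive affect
        elif delta > eps:
            states.append("regress")    # loss rising: negative affect
        else:
            states.append("stalled")    # plateau: e.g. boredom/frustration
    return states

states = affect_from_learning_curve([1.0, 0.8, 0.8, 0.9])
```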

Marvin Minsky, one of the pioneering computer scientists in artificial intelligence, relates emotions to the broader issues of machine intelligence stating in The Emotion Machine that emotion is "not especially different from the processes that we call 'thinking.'"[9]

Technologies of affective computing

Emotional speech

Emotional speech processing recognizes the user's emotional state by analyzing speech patterns. Vocal parameters and prosody features such as pitch variables and speech rate are analyzed through pattern recognition.[10][11]
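As a sketch of the feature-extraction step, the following Python computes two prosody features of the kind mentioned above, pitch variability and speech rate, from a frame-level pitch track. The input values, the 0 Hz convention for unvoiced frames, and the choice of standard deviation as the variability measure are assumptions made for illustration.

```python
import statistics

def prosody_features(pitch_track_hz, n_syllables, duration_s):
    """Two prosody features often used in emotional speech recognition:
    pitch variability and speech rate. Input is a frame-level pitch
    track in Hz (0 Hz marking unvoiced frames), a syllable count, and
    the utterance duration in seconds; all example values are invented."""
    voiced = [f for f in pitch_track_hz if f > 0]
    pitch_sd = statistics.stdev(voiced)      # pitch variability (Hz)
    speech_rate = n_syllables / duration_s   # syllables per second
    return pitch_sd, speech_rate

pitch_sd, speech_rate = prosody_features(
    [0.0, 180.0, 200.0, 220.0, 0.0, 190.0], n_syllables=6, duration_s=2.0
)
```

Features like these would then be fed to a pattern recognizer of the kind described in the detection section.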

Emotional inflection and modulation in synthesized speech, whether through phrasing or acoustic features, is useful in human-computer interaction. Such capability makes speech sound more natural and expressive. For example, a dialog system might modulate its speech to be more childlike if it deems the emotional model of its current user to be that of a child.[citation needed]

Facial expression

The detection and processing of facial expression are achieved through various methods such as optical flow, hidden Markov models, neural network processing or active appearance models. More than one modality can be combined or fused (multimodal recognition, e.g. facial expressions and speech prosody[12] or facial expressions and hand gestures[13]) to provide a more robust estimate of the subject's emotional state.
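The fusion idea above can be sketched as decision-level ("late") fusion, one common scheme in which each modality first produces its own class probabilities and these are then combined. The weights and emotion labels below are illustrative assumptions, not values from the cited systems.

```python
def late_fusion(face_probs, voice_probs, w_face=0.6, w_voice=0.4):
    """Decision-level (late) fusion: combine per-modality class
    probabilities with a weighted average and pick the most likely
    label. Weights and emotion labels are illustrative."""
    fused = {
        label: w_face * face_probs[label] + w_voice * voice_probs[label]
        for label in face_probs
    }
    best = max(fused, key=fused.get)
    return best, fused

best, fused = late_fusion(
    {"happy": 0.7, "angry": 0.2, "neutral": 0.1},  # facial-expression classifier
    {"happy": 0.4, "angry": 0.5, "neutral": 0.1},  # speech-prosody classifier
)
```

An alternative is feature-level ("early") fusion, in which the feature vectors of the modalities are concatenated before a single classifier is trained.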

Body gesture

Body gesture detection concerns the posture of the body and its changes over time. There are many proposed methods[14] for detecting body gestures. Hand gestures have been a common focus of such detection; appearance-based methods[15] and 3-D modeling methods are traditionally used.

Visual aesthetics

Aesthetics, in the world of art and photography, refers to the principles of the nature and appreciation of beauty. Judging beauty and other aesthetic qualities is a highly subjective task. Computer scientists at Penn State treat the challenge of automatically inferring the aesthetic quality of pictures from their visual content as a machine learning problem, with a peer-rated on-line photo sharing website as a data source.[16] They extract certain visual features based on the intuition that these can discriminate between aesthetically pleasing and displeasing images. The work is demonstrated in the ACQUINE system[17] on the Web.
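The feature-based approach described above can be sketched as a simple linear scoring of extracted visual features. This is a toy stand-in: a deployed system such as ACQUINE learns its model from peer ratings, whereas the feature names and weights here are made-up placeholders.

```python
def aesthetic_score(features, weights, bias=0.0):
    """Toy linear predictor of aesthetic quality: a weighted sum of
    low-level visual features. A real system would learn its weights
    from peer-rated photos; these are made-up placeholders."""
    return bias + sum(w * features[name] for name, w in weights.items())

# Hypothetical features one might extract from a photograph.
features = {"colorfulness": 0.8, "rule_of_thirds": 0.6, "blur": 0.3}
weights = {"colorfulness": 0.5, "rule_of_thirds": 0.7, "blur": -0.6}
score = aesthetic_score(features, weights)
```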

Potential applications

In e-learning applications, affective computing can be used to adjust the presentation style of a computerized tutor when a learner is bored, interested, frustrated, or pleased.[18][19] Psychological health services, e.g. counseling, benefit from affective computing applications when determining a client's emotional state.[citation needed] Affective computing can also be used to convey an emotional state to others via color or sound.[citation needed]

Robotic systems capable of processing affective information exhibit greater flexibility when working in uncertain or complex environments. Companion devices, such as digital pets, use affective computing abilities to enhance realism and provide a higher degree of autonomy.[citation needed]

Other potential applications are centered around social monitoring. For example, a car could monitor the emotions of all occupants and engage additional safety measures, such as alerting other vehicles if it detects the driver to be angry.[citation needed] Affective computing has potential applications in human-computer interaction, such as affective mirrors that allow users to see how they perform; emotion-monitoring agents that send a warning before one sends an angry email; or music players that select tracks based on mood.[citation needed]

Affective computing is also being applied to the development of communicative technologies for use by people with autism.[20]

Application examples

See also

References

  1. ^ Tao, Jianhua; Tan, Tieniu (2005). "Affective Computing: A Review". Affective Computing and Intelligent Interaction. LNCS 3784. Springer. pp. 981–995. doi:10.1007/11573548.
  2. ^ James, William (1884). "What is Emotion". Mind. 9: 188–205. doi:10.1093/mind/os-IX.34.188. Cited by Tao and Tan.
  3. ^ "Affective Computing" MIT Technical Report #321 (Abstract), 1995
  4. ^ Kleine-Cosack, Christian (2006). "Recognition and Simulation of Emotions" (PDF). Archived from the original (PDF) on May 28, 2008. Retrieved May 13, 2008. "The introduction of emotion to computer science was done by Pickard (sic) who created the field of affective computing."
  5. ^ Diamond, David (2003). "The Love Machine; Building computers that care". Wired. Retrieved May 13, 2008. "Rosalind Picard, a genial MIT professor, is the field's godmother; her 1997 book, Affective Computing, triggered an explosion of interest in the emotional side of computers and their users."
  6. ^ Garay, Nestor; et al. (2006). "Assistive Technology and Affective Mediation" (PDF). Human Technology: An Interdisciplinary Journal on Humans in ICT Environments. 2 (1): 55–83. Retrieved 2008-05-12.
  7. ^ Calvo, Rafael A.; et al. (2010). "Affect Detection: An Interdisciplinary Review of Models, Methods, and Their Applications" (PDF). IEEE Transactions on Affective Computing. 1 (1): 18–37. Retrieved 2010-10-15.
  8. ^ Heise, David (2004). Agent Culture: Human-Agent Interaction in a Multicultural World. Lawrence Erlbaum Associates. pp. 127–142.
  9. ^ Restak, Richard (2006-12-17). "Mind Over Matter". The Washington Post. Retrieved 2008-05-13.
  10. ^ Dellaert, F.; Polizin, T.; Waibel, A., "Recognizing Emotion in Speech", in Proc. of ICSLP 1996, Philadelphia, PA, pp. 1970–1973, 1996
  11. ^ Lee, C.M.; Narayanan, S.; Pieraccini, R., Recognition of Negative Emotion in the Human Speech Signals, Workshop on Auto. Speech Recognition and Understanding, Dec 2001
  12. ^ G. Caridakis, L. Malatesta, L. Kessous, N. Amir, A. Raouzaiou, K. Karpouzis, Modeling naturalistic affective states via facial and vocal expressions recognition, International Conference on Multimodal Interfaces (ICMI’06), Banff, Alberta, Canada, November 2-4, 2006
  13. ^ T. Balomenos, A. Raouzaiou, S. Ioannou, A. Drosopoulos, K. Karpouzis, S. Kollias, Emotion Analysis in Man-Machine Interaction Systems, Samy Bengio, Herve Bourlard (Eds.), Machine Learning for Multimodal Interaction, Lecture Notes in Computer Science, Vol. 3361, 2004, pp. 318 - 328, Springer-Verlag
  14. ^ J. K. Aggarwal, Q. Cai, Human Motion Analysis: A Review, Computer Vision and Image Understanding, Vol. 73, No. 3, 1999
  15. ^ Vladimir I. Pavlovic, Rajeev Sharma, Thomas S. Huang, Visual Interpretation of Hand Gestures for Human-Computer Interaction; A Review, IEEE Transactions on Pattern Analysis and Machine Intelligence, 1997
  16. ^ Ritendra Datta, Dhiraj Joshi, Jia Li and James Z. Wang, Studying Aesthetics in Photographic Images Using a Computational Approach, Lecture Notes in Computer Science, vol. 3953, Proceedings of the European Conference on Computer Vision, Part III, pp. 288-301, Graz, Austria, May 2006.
  17. ^ http://acquine.alipr.com
  18. ^ AutoTutor
  19. ^ S. Asteriadis, P. Tzouveli, K. Karpouzis, S. Kollias, Estimation of behavioral user state based on eye gaze and head pose—application in an e-learning environment, Multimedia Tools and Applications, Springer, Volume 41, Number 3 / February, 2009, pp. 469-493.
  20. ^ Projects in Affective Computing
  21. ^ The Humaine Association