Artificial empathy

From Wikipedia, the free encyclopedia

Artificial empathy (AE) is the development of AI systems, such as companion robots, that are able to detect and respond to human emotions. Although many people perceive the technology as scary or threatening,[1] researchers argue that it could hold a significant advantage over humans in professions that traditionally involve emotional role-playing, such as the health care sector.[2] From the caregiver's perspective, for instance, performing emotional labor above and beyond the requirements of paid work often results in chronic stress or burnout, and in a feeling of being desensitized to patients. It has been argued, however, that emotional role-playing between the care receiver and a robot can have a more positive outcome, creating conditions of less fear and concern about one's own predicament, best exemplified by the phrase: "if it is just a robot taking care of me, it cannot be that critical." Scholars debate the possible outcome of such technology from two perspectives: AE could either aid the socialization of caregivers or serve as a model of emotional detachment.[2][3]

Areas of research

There are a variety of philosophical, theoretical, and applicative questions related to AE. For example:

  1. Which conditions would have to be met for a robot to respond competently to a human emotion?
  2. What models of empathy can or should be applied to social and assistive robotics?
  3. Does the interaction of humans with robots have to imitate affective interaction between humans?
  4. Can a robot help science learn about affective development of humans?
  5. Would robots create unforeseen categories of inauthentic relations?
  6. What relations with robots can be considered truly authentic?[4]

References

  1. ^ Jan-Philipp Stein; Peter Ohler (2017). "Venturing into the uncanny valley of mind—The influence of mind attribution on the acceptance of human-like characters in a virtual reality setting". Cognition. 160: 43–50. doi:10.1016/j.cognition.2016.12.010. ISSN 0010-0277. 
  2. ^ a b Bert Baumgaertner; Astrid Weiss (26 February 2014). "Do Emotions Matter in the Ethics of Human-Robot Interaction?" (PDF). Artificial Empathy and Companion Robots. European Community’s Seventh Framework Programme (FP7/2007-2013) under grant agreement No. 288146 (“HOBBIT”); and the Austrian Science Foundation (FWF) under grant agreement T623-N23 (“V4HRC”) – via direct download. 
  3. ^ Minoru Asada (14 February 2014). "Affective Developmental Robotics" (PDF). How can we design the development of artificial empathy?. Osaka, Japan: Dept. of Adaptive Machine Systems, Graduate School of Engineering, Osaka University – via direct download. 
  4. ^ Luisa Damiano; Paul Dumouchel; Hagen Lehmann (6 February 2014). "Artificial Empathy: An Interdisciplinary Investigation" (PDF). Special issue. International Journal of Social Robotics (IJSR) – via direct download.