Uncanny valley

From Wikipedia, the free encyclopedia

This is an old revision of this page, as edited by 67.176.134.41 (talk) at 00:25, 19 February 2011 (some random youtube video is not a reliable source of info for this assertion). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.

The uncanny valley is a hypothesis in the field of robotics.[1] The hypothesis holds that when robots and other facsimiles of humans look and act almost, but not perfectly, like actual humans, they cause a response of revulsion among human observers. The "valley" in question is a dip in a proposed graph of the positivity of human reaction as a function of a robot's lifelikeness.

The term was coined by roboticist Masahiro Mori as Bukimi no Tani Genshō (不気味の谷現象) in 1970, and has been linked to Ernst Jentsch's concept of "the uncanny" identified in a 1906 essay, "On the Psychology of the Uncanny".[2] Jentsch's conception is famously elaborated upon by Sigmund Freud in a 1919 essay titled "The Uncanny" ("Das Unheimliche").[3] A similar problem exists in realistic 3D computer animation.[4][5]

Hypothesis

Hypothesized emotional response of human subjects is plotted against anthropomorphism of a robot, following Mori's statements. The uncanny valley is the region of negative emotional response towards robots that seem "almost human". Movement amplifies the emotional response.[6]

Mori's hypothesis states that as a robot is made more humanlike in its appearance and motion, the emotional response from a human being to the robot will become increasingly positive and empathic, until a point is reached beyond which the response quickly becomes that of strong revulsion. However, as the appearance and motion continue to become less distinguishable from a human being, the emotional response becomes positive once more and approaches human-to-human empathy levels.[7]

This area of repulsive response aroused by a robot with appearance and motion between a "barely human" and "fully human" entity is called the uncanny valley. The name captures the idea that a robot which is "almost human" will seem overly "strange" to a human being and thus will fail to evoke the empathic response required for productive human-robot interaction.[7]
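Mori described the valley only qualitatively and gave no mathematical formula. The following is a minimal illustrative sketch in Python, using an assumed toy function (a rising trend minus a narrow Gaussian dip near full human likeness) that merely reproduces the qualitative shape described above; the function name, the dip's position, and its depth are all hypothetical choices, not part of Mori's hypothesis:

```python
import math

def hypothetical_affinity(likeness: float) -> float:
    """Toy model of emotional response versus human likeness (0.0 to 1.0).

    A linear upward trend (affinity grows with likeness) minus a narrow
    negative Gaussian dip centred just short of full human likeness
    stands in for the "valley". All constants are illustrative.
    """
    trend = likeness
    valley = 1.4 * math.exp(-((likeness - 0.85) ** 2) / 0.004)
    return trend - valley

if __name__ == "__main__":
    # Affinity rises, plunges below zero near the valley, then recovers
    # toward human-to-human empathy levels at full likeness.
    for x in (0.0, 0.5, 0.85, 1.0):
        print(f"likeness={x:.2f}  affinity={hypothetical_affinity(x):+.2f}")
```

The "almost human" region (here, likeness near 0.85) scores lower than either a clearly robotic or a fully human appearance, matching the shape of Mori's proposed graph.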

Theoretical basis

Repliee Q2

A number of theories have been proposed to explain the cognitive mechanism underlying the phenomenon:

  • Mate selection. Automatic, stimulus-driven appraisals of uncanny stimuli elicit aversion by activating an evolved cognitive mechanism for the avoidance of selecting mates with low fertility, poor hormonal health, or ineffective immune systems based on visible features of the face and body that are predictive of those traits.[8][9]
  • Mortality salience. Viewing an "uncanny" robot elicits an innate fear of death and culturally supported defenses for coping with death's inevitability. "[P]artially disassembled androids ... play on subconscious fears of reduction, replacement, and annihilation: (1) A mechanism with a human facade and a mechanical interior plays on our subconscious fear that we are all just soulless machines. (2) Androids in various states of mutilation, decapitation, or disassembly are reminiscent of a battlefield after a conflict and, as such, serve as a reminder of our mortality. (3) Since most androids are copies of actual people, they are Doppelgaenger and may elicit a fear of being replaced, on the job, in a relationship, and so on. (4) The jerkiness of an android's movements could be unsettling because it elicits a fear of losing bodily control."[10]
  • Pathogen avoidance. Uncanny stimuli may activate a cognitive mechanism that originally evolved to motivate the avoidance of potential sources of pathogens by eliciting a disgust response. “The more human an organism looks, the stronger the aversion to its defects, because (1) defects indicate disease, (2) more human-looking organisms are more closely related to human beings genetically, and (3) the probability of contracting disease-causing bacteria, viruses, and other parasites increases with genetic similarity.”[9][11] Thus, the visual anomalies of android robots and animated human characters have the same effect as those of corpses and visibly diseased individuals: the elicitation of alarm and revulsion.
  • Sorites paradoxes. Stimuli with human and nonhuman traits undermine our sense of human identity by linking qualitatively different categories, human and nonhuman, by a quantitative metric, degree of human likeness.[12]
  • Violation of human norms. The uncanny valley may "be symptomatic of entities that elicit a model of a human other but do not measure up to it."[13] If an entity looks sufficiently nonhuman, its human characteristics will be noticeable, generating empathy. However, if the entity looks almost human, it will elicit our model of a human other and its detailed normative expectations. The nonhuman characteristics will be noticeable, giving the human viewer a sense of strangeness. In other words, a robot stuck inside the uncanny valley is no longer being judged by the standards of a robot doing a passable job at pretending to be human, but is instead being judged by the standards of a human doing a terrible job at acting like a normal person.
  • Western, Middle Eastern, and religious constructions of human identity. The existence of artificial but humanlike entities is viewed as a threat to the concept of human identity, as constructed in the West and the Middle East. This is particularly the case with the Abrahamic religions (Christianity, Islam, and Judaism), which emphasize human uniqueness.[14] An example can be found in the theoretical framework of psychiatrist Irvin Yalom. Yalom explains that humans construct psychological defenses in order to avoid existential anxiety stemming from death. One of these defenses is "specialness", the irrational belief that aging and death as central premises of life apply to all others but oneself.[15] The experience of the very humanlike "living" robot can be so rich and compelling that it challenges humans' notions of "specialness" and existential defenses, eliciting existential anxiety.

Research

One recent study examined the evolutionary mechanism behind the aversion associated with the uncanny valley. A group of five monkeys was shown three images: two different 3D monkey faces (one realistic, one unrealistic) and a real photograph of a monkey's face. The monkeys' eye-gaze was used as a proxy for preference or aversion. Because the realistic 3D monkey face was looked at less than either the real photograph or the unrealistic 3D face, this was interpreted as an indication that the monkeys found the realistic 3D face aversive, or otherwise preferred the other two images. As the uncanny valley hypothesis would predict, greater realism led to less positive reactions, suggesting that the effect depends on neither human-specific cognitive processes nor human culture and may therefore be evolutionary in origin.[16]

Researchers at University of California San Diego are testing human brain activations related to the uncanny valley.[17] “We aim to improve our understanding of how the human brain enables social cognition, and to help engineers and designers develop interactive agents that are well-suited to their application domains as well as to the brains of their creators,” said Saygin, leading the research.

Tinwell et al. (2011) are also investigating viewer perception of facial expression and speech and the uncanny valley in realistic, human-like characters intended for 3D immersive environments and video games.[18] Building on the body of work already undertaken in android science, this research intends to build a conceptual framework of the uncanny valley using 3D characters generated in a real-time gaming engine, analysing how cross-modal factors of facial expression and speech may exaggerate the uncanny.

Design principles

A number of design principles have been proposed for avoiding the uncanny valley:

  • Design elements should match in human realism. For a robot to give a more positive impression, its degree of human realism in appearance should match its degree of human realism in behavior.[19] If an animated character looks more human than its movement, this gives a negative impression.[20] A robot may look uncanny when human and nonhuman elements are mixed.[21] Human brain activation patterns also seem to indicate that matching appearance and motion kinematics is important.[22]
  • Appearance and behavior should match ability. In terms of performance, if a robot looks too appliance-like, people will expect little from it; if it looks too human, people will expect too much from it.[19]
  • Human facial proportions and photorealistic texture should only be used together. A photorealistic human texture demands human facial proportions, or the computer generated character can fall into the uncanny valley. Abnormal facial proportions, including those typically used by artists to enhance attractiveness (e.g., larger eyes), can look eerie with a photorealistic human texture. Avoiding a photorealistic texture can permit more leeway.[23]

Criticism

A number of criticisms have been raised concerning whether the uncanny valley exists as a unified phenomenon amenable to scientific scrutiny:

  • Good design can lift human-looking entities out of the valley. David Hanson has criticized Mori's hypothesis that entities approaching human appearance will necessarily be evaluated negatively.[24] He has shown that the uncanny valley that Karl MacDorman and Hiroshi Ishiguro[25] generated – by having participants rate photographs that morphed from humanoid robots to android robots to human beings – could be flattened out by adding neotenous, cartoonish features to the entities that had formerly fallen into the valley.[24]
  • The uncanny appears at any degree of human likeness. Hanson has also pointed out that uncanny entities may appear anywhere in a spectrum ranging from the abstract (e.g., MIT's robot Lazlo) to the perfectly human (e.g., cosmetically atypical people).[24] Capgras syndrome is a relatively rare condition in which the sufferer believes that people (or, in some cases, things) have been replaced with duplicates. These duplicates are rationally accepted to be identical in physical properties, but the irrational belief is held that the "true" entity has been replaced with something else. Some sufferers of Capgras syndrome claim that the duplicate is a robot. Ellis and Lewis argue that the syndrome arises from an intact system for overt recognition coupled with a damaged system for covert recognition, which leads to conflict over an individual being identifiable but not familiar in any emotional sense.[26] This supports the view that the uncanny valley could arise due to issues of categorical perception that are particular to the manner in which the social brain processes information.[27]
  • The uncanny valley is a heterogeneous group of phenomena. Phenomena labeled as being in the uncanny valley can be diverse, involve different sense modalities, and have multiple, possibly overlapping causes, which can range from evolved or learned circuits for early face perception[23][28] to culturally-shared psychological constructs.[29] People's cultural backgrounds may have a considerable influence on how androids are perceived with respect to the uncanny valley.[30]

History

An effect similar to the uncanny valley was noted by Charles Darwin in 1839:

The expression of this [Trigonocephalus] snake’s face was hideous and fierce; the pupil consisted of a vertical slit in a mottled and coppery iris; the jaws were broad at the base, and the nose terminated in a triangular projection. I do not think I ever saw anything more ugly, excepting, perhaps, some of the vampire bats. I imagine this repulsive aspect originates from the features being placed in positions, with respect to each other, somewhat proportional to the human face; and thus we obtain a scale of hideousness.

— Charles Darwin, The Voyage of the Beagle[31]

Transhumanism

According to writer Jamais Cascio, a similar "uncanny valley" effect could show up when humans begin modifying themselves with transhuman enhancements (cf. body modification), which aim to improve the abilities of the human body beyond what would normally be possible, be it eyesight, muscle strength, or cognition.[32] So long as these enhancements remain within a perceived norm of human behavior, a negative reaction is unlikely, but once individuals exceed normal human variety, revulsion can be expected. However, according to this theory, once such technologies gain further distance from human norms, "transhuman" individuals would cease to be judged on human levels and instead be regarded as separate entities altogether (a point that has been dubbed "posthuman"), and it is here that acceptance would rise once again out of the uncanny valley.[32] Another example comes from "pageant retouching" photos, especially of children, which some[33] find disturbingly doll-like.

Film and television

Roboticist Dario Floreano stated that the concept of the uncanny valley is taken seriously by the film industry due to negative audience reactions to the animated baby in Pixar's 1988 short film Tin Toy.[34][35] The 2004 CGI animated film The Polar Express as well as the 2007 CGI animated film Beowulf were criticized by reviewers who felt that the appearances of the characters were "creepy" or "eerie". More recently, the 2010 film Tron: Legacy was also heavily criticized by viewers who claimed that the CGI rendering of Clu 2.0 was "appalling".[36]

In the 2008 30 Rock episode "Succession", Frank Rossitano explains the uncanny valley concept, using a graph and Star Wars examples, to try to convince Tracy Jordan that his dream of creating a pornographic video game is impossible. He also references The Polar Express.[37]

In the Doctor Who episode The Robots of Death, a similar concept is referred to as "Grimwade's Syndrome", which is described as a psychological condition among people with frequent contact with robots, attributed to the robots moving like humans but without any of the characteristic human body language. Those afflicted feel, in the words of the Doctor, "surrounded by walking, talking dead men."

In Star Trek: The Next Generation, episode Datalore, Lore initially claims that he was created after Data, saying when Picard asks who was created first, "He was, but they found him to be imperfect, and I was made to replace him." Later in the episode Lore reveals to Data that, in fact, Lore was created first. He explains that he was so close to human that the human colonists asked his creator to create a less-perfect android, Data. (However, Lore is lying, and in a later episode the creator of both androids, Dr. Noonian Soong, tells Data that he is not less perfect than Lore, and says that Lore was deactivated because he was unstable.)

The Tachikomas in the anime television series Ghost in the Shell: Stand Alone Complex worry that the Major dislikes them because of their increasingly human-like personalities, in spite of their tank-like outward appearance, and try to regain her favor by acting more like robots.

Episode 12 (season 5) of Criminal Minds is titled "The Uncanny Valley" and explores the theme through the lens of a serial abductress (and murderess) who chemically paralyzes the women she abducts and treats them like dolls.

In the season 6 episode of Red Dwarf, "Out of Time", Kryten mentions the Uncanny Valley (though not by name) as the reason his faceted head was designed to look so inhuman. The series of Mechanoids that preceded his were hyper-realistic, and people's natural revulsion to such realistic-appearing machines severely hurt their sales.

See also

Notes

  1. ^ http://www.popularmechanics.com/science/robotics/4343054.html
  2. ^ Jentsch, E. (25 Aug. 1906). Zur Psychologie des Unheimlichen, Psychiatrisch-Neurologische Wochenschrift 8(22), 195-198.
  3. ^ Freud, S. (1919/2003). The uncanny [das unheimliche] (D. McLintock, Trans.). New York: Penguin.
  4. ^ When fantasy is just too close for comfort - The Age, June 10, 2007
  5. ^ Digital Actors in ‘Beowulf’ Are Just Uncanny - New York Times, November 14, 2007
  6. ^ MacDorman, 2005.
  7. ^ a b Mori, Masahiro (1970). Bukimi no tani The uncanny valley (K. F. MacDorman & T. Minato, Trans.). Energy, 7(4), 33–35. (Originally in Japanese)
  8. ^ Green, MacDorman, Ho, Koch, 2008.
  9. ^ a b Rhodes, G. & Zebrowitz, L. A. (eds) (2002). Facial Attractiveness: Evolutionary, Cognitive, and Social Perspectives, Ablex Publishing.
  10. ^ MacDorman & Ishiguro, 2006, p. 313.
  11. ^ MacDorman, Green, Ho, & Koch, 2009, p. 696.
  12. ^ Ramey, 2005.
  13. ^ MacDorman & Ishiguro, 2006, p. 303.
  14. ^ MacDorman, K. F., Vasudevan, S. K., & Ho, C.-C., 2009.
  15. ^ Yalom, Irvin D. (1980) ”Existential Psychotherapy”, Basic Books, Inc., Publishers, New York
  16. ^ Monkey visual behavior falls into the uncanny valley
  17. ^ http://ucsdnews.ucsd.edu/newsrel/science/05-13ResearchOpportunities.asp
  18. ^ http://digitalcommons.bolton.ac.uk/cgi/viewcontent.cgi?article=1013&context=gcct_journalspr
  19. ^ a b Goetz, Kiesler, & Powers, 2003.
  20. ^ Vinayagamoorthy, Steed, & Slater, 2005.
  21. ^ Ho, MacDorman, Pramono, 2008.
  22. ^ Saygin, Chaminade & Ishiguro 2010
  23. ^ a b MacDorman, Green, Ho, & Koch, 2009.
  24. ^ a b c Hanson, 2005.
  25. ^ MacDorman & Ishiguro, 2006, p. 305.
  26. ^ Ellis, H., & Lewis, M. (2001). Capgras delusion: A window on face recognition. Trends in Cognitive Science, 5(4), 149-156.
  27. ^ Pollick, F. In Search of the Uncanny Valley. Analog communication: Evolution, brain mechanisms, dynamics, simulation. Cambridge, MA: MIT Press: The Vienna Series in Theoretical Biology (2009)
  28. ^ MacDorman & Ishiguro, 2006
  29. ^ MacDorman, Vasudevan & Ho, 2008.
  30. ^ Bartneck Kanda, Ishiguro, & Hagita, 2007.
  31. ^ Charles Darwin. The Voyage of the Beagle . New York: Modern Library. 2001. p. 87.
  32. ^ a b Jamais Cascio, The Second Uncanny Valley
  33. ^ Pageant retouching
  34. ^ Dario Floreano. Bio-Mimetic Robotics
  35. ^ EPFL. [1]
  36. ^ Loder, Kurt (2004-11-10). "'The Polar Express' Is All Too Human". MTV. Retrieved 2007-12-14.
  37. ^ Michael Neal (April 25, 2008). "Succession". Yahoo! TV.

References