Social robot

From Wikipedia, the free encyclopedia

A social robot is an autonomous robot that interacts and communicates with humans or other autonomous physical agents by following the social behaviors and rules attached to its role. Like other robots, a social robot is physically embodied; avatars and on-screen synthetic social characters are not embodied and are therefore distinct. Some synthetic social agents are designed with a screen that represents the head or 'face' and communicates dynamically with users. In these cases, whether the agent counts as a social robot depends on the form of its 'body': if the body has and uses physical motors and sensors, the system can be considered a robot.


While robots have often been described as possessing social qualities (see, for example, the tortoises developed by William Grey Walter in the 1950s), social robotics is a fairly recent branch of robotics. Since the early 1990s, artificial intelligence and robotics researchers have developed robots that explicitly engage with people on a social level. Notable researchers include Cynthia Breazeal, Tony Belpaeme, Aude Billard, Kerstin Dautenhahn, Yiannis Demiris, Hiroshi Ishiguro, Maja Mataric, Javier Movellan, Brian Scassellati and Dean Weber. Also related is the Kansei engineering movement in Japanese science and technology; for social robotics, see especially the work of Takayuki Kanda, Hideki Kozima, Hiroshi Ishiguro, Michio Okada, Tomio Watanabe, and P. Ravindra S. De Silva.

Designing an autonomous social robot is particularly challenging, as the robot needs to correctly interpret people's actions and respond appropriately, which current technology cannot yet do reliably. Moreover, people interacting with a social robot may hold very high expectations of its capabilities, based on science fiction representations of advanced social robots. As a result, many social robots are partially or fully remote controlled to simulate advanced capabilities. This method of (often covertly) controlling a social robot is referred to as a Wizard of Oz setup, after the character in the L. Frank Baum book, or as a Mechanical Turk, after the 18th-century chess-playing automaton. Wizard of Oz studies are useful in social robotics research for evaluating how people respond to social robots.
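The Wizard of Oz method can be sketched in a few lines: a hidden operator picks behaviours that the robot then performs, so participants perceive the robot as autonomous. The behaviour names and logging below are invented for illustration and do not correspond to any particular robot platform.

```python
# Hypothetical Wizard-of-Oz control sketch. A hidden operator chooses a
# behaviour; the robot executes it and logs it for later study analysis.
# Behaviour names are invented for illustration only.

BEHAVIOURS = {
    "greet": "wave and say hello",
    "nod": "nod head",
    "point": "point at object",
    "idle": "subtle idle motion",
}

def wizard_step(operator_choice, log):
    """Execute the operator's chosen behaviour; fall back to idling."""
    action = BEHAVIOURS.get(operator_choice, BEHAVIOURS["idle"])
    log.append(action)  # record each action for post-hoc analysis
    return action

log = []
for choice in ["greet", "nod", "unknown"]:
    wizard_step(choice, log)
print(log)  # → ['wave and say hello', 'nod head', 'subtle idle motion']
```

Because the participant never sees the operator, the logged actions can later be compared against participant responses, which is how such studies evaluate reactions to apparently autonomous behaviour.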


A robot is defined by the International Organization for Standardization (ISO) as a reprogrammable, multifunctional manipulator designed to move material, parts, tools or specialized devices through variable programmed motions for the performance of a variety of tasks. As a subset of robots, social robots perform any or all of these processes in the context of a social interaction. The nature of the social interaction is immaterial and may range from relatively simple supportive tasks, such as passing tools to a worker, to complex expressive communication and collaboration, such as assistive healthcare. Hence, social robots are expected to work together with humans in collaborative workspaces. Moreover, social robots are beginning to follow humans into much more personal settings such as the home, health care, and education.

Social interactions are likely to be cooperative, but the definition is not limited to that situation. Uncooperative behavior can also be considered social in certain situations: the robot could, for example, exhibit competitive behavior within the framework of a game. The robot could also interact with minimal or no communication; it could, for example, hand tools to an astronaut working on a space station. However, some communication is likely to be necessary at some point.

Two suggested[by whom?] ultimate requirements for social robots are the Turing test, to determine the robot's communication skills, and Isaac Asimov's Three Laws of Robotics for its behavior. (The usefulness of applying these requirements in real-world applications, especially in the case of Asimov's laws, is still disputed[by whom?] and may not be possible at all.) However, a consequence of this viewpoint is that a robot that only interacts and communicates with other robots would not be considered a social robot: being social is bound to humans and their society, which defines the necessary social values, norms and standards.[1] This results in a cultural dependency of social robots, since social values, norms and standards differ between cultures.

This brings us directly to the last part of the definition. A social robot must interact within the social rules attached to its role, and the role and its rules are defined by society. For example, a robotic butler for humans would have to comply with established rules of good service: it should be anticipatory, reliable and, most of all, discreet. A social robot must be aware of these rules and comply with them. Social robots that interact with other autonomous robots, however, may behave and interact according to non-human conventions. In most social robots, the complexity of human-to-human interaction will be approached gradually, with advances in android technology (a form of humanoid robot) and the implementation of a variety of more human-like communication skills.[2]

An example of a social robot is Kaspar, designed by the University of Hertfordshire to help children with autism learn responses from the robot through games and interactive play.[3]

Social Interaction

Researchers have investigated user engagement with robot companions, and the literature presents several models of engagement. One example is a framework that models both the causes and the effects of engagement, using features of the user's non-verbal behaviour, the task, and the companion's affective reactions to predict children's level of engagement.[4]
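The details of the cited framework are in the reference; purely as an illustration of the idea, a feature-based engagement predictor can be sketched as a weighted combination of behavioural, task, and affective features squashed into a probability. All feature names and weights below are invented for illustration; the actual system uses a trained classifier over task and social-interaction features.

```python
# Hypothetical sketch of a feature-based engagement predictor.
# Feature names and weights are invented for illustration; a real system
# would learn its parameters from annotated interaction data.

import math

def predict_engagement(features):
    """Return an engagement probability from hand-picked feature weights."""
    weights = {
        "gaze_at_robot": 2.0,       # fraction of time looking at the robot
        "smile_intensity": 1.5,     # non-verbal behaviour
        "task_progress": 1.0,       # task-related feature
        "robot_affect_match": 0.8,  # companion's affective reaction
    }
    bias = -2.5
    score = bias + sum(w * features.get(k, 0.0) for k, w in weights.items())
    return 1.0 / (1.0 + math.exp(-score))  # logistic squashing to [0, 1]

p = predict_engagement({"gaze_at_robot": 0.9, "smile_intensity": 0.6,
                        "task_progress": 0.7, "robot_affect_match": 0.5})
print(f"engagement probability: {p:.2f}")  # → engagement probability: 0.79
```

The point of the sketch is that engagement is inferred from multiple channels at once, non-verbal cues, task state, and the robot's own affective display, rather than from any single signal.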

Societal Impacts

The increasingly widespread use of more advanced social robots is one of several phenomena expected to contribute to the technological posthumanization of human societies, through which process “a society comes to include members other than ‘natural’ biological human beings who, in one way or another, contribute to the structures, dynamics, or meaning of the society.”[5]

Further references

  • Walter, W. Grey (May 1950). "An Imitation of Life". Scientific American. pp. 42–45.
  • Dautenhahn, Kerstin (1994). Gaussier, P.; Nicoud, J. D. (eds.). "Trying to Imitate - a Step Towards Releasing Robots from Social Isolation". Proceedings: From Perception to Action Conference. Lausanne, Switzerland: IEEE Computer Society Press: 290–301. ISBN 0-8186-6482-7.
  • Dautenhahn, Kerstin (1995). "Getting to know each other - artificial social intelligence for autonomous robots". Robotics and Autonomous Systems. 16 (2–4): 333–356. doi:10.1016/0921-8890(95)00054-2.
  • Breazeal, Cynthia L. (2002). Designing Sociable Robots. MIT Press. ISBN 0-262-02510-8.
  • Fong, Terrence; Nourbakhsh, Illah R.; Dautenhahn, Kerstin (2003). "A survey of socially interactive robots". Robotics and Autonomous Systems. 42 (3–4): 143–166. doi:10.1016/S0921-8890(02)00372-X.
  • Duffy, Brian R. (2004). "Social Embodiment in Autonomous Mobile Robotics". International Journal of Advanced Robotic Systems. 1 (3): 155–170.
  • Duffy, Brian R. (2008). "Fundamental Issues in Affective Intelligent Social Machines". Open Artificial Intelligence Journal. 2 (14): 21–34. doi:10.2174/1874061800802010021. ISSN 1874-0618.


  1. ^ Taipale, S.; Vincent, J.; Sapio, B.; Lugano, G.; Fortunati, L. (2015). "Introduction: Situating the Human in Social Robots". In Vincent, J.; et al. (eds.). Social Robots from a Human Perspective. Dordrecht: Springer. pp. 1–17.
  2. ^ "Implications of interpersonal communication competence research on the design of artificial behavioral systems that interact with humans". Retrieved 3 March 2017.
  3. ^ "Robot at Hertfordshire University aids autistic children". BBC. Retrieved 28 Sep 2014.
  4. ^ Castellano, Ginevra; Pereira, André; Leite, Iolanda; Paiva, Ana; McOwan, Peter W. (2009). "Detecting user engagement with a robot companion using task and social interaction-based features". Proceedings of the 2009 International Conference on Multimodal Interfaces - ICMI-MLMI '09. Cambridge, Massachusetts, USA: ACM Press: 119. doi:10.1145/1647314.1647336. ISBN 9781605587721.
  5. ^ Gladden, Matthew (2018). Sapient Circuits and Digitalized Flesh: The Organization as Locus of Technological Posthumanization (second ed.). Indianapolis, IN: Defragmenter Media. p. 19. ISBN 978-1-944373-21-4.
