Social robot

From Wikipedia, the free encyclopedia
Quori, a socially interactive robot platform for studying human-robot interaction, Immersive Kinematics Lab

A social robot is an autonomous robot that interacts and communicates with humans or other autonomous physical agents by following social behaviors and rules attached to its role. Like other robots, a social robot is physically embodied (avatars or on-screen synthetic social characters are not embodied and are thus distinct). Some synthetic social agents are designed with a screen that represents the head or 'face' and communicates dynamically with users. In these cases, whether the system counts as a social robot depends on the form of its 'body': if the body has, and uses, physical motors and sensing abilities, the system can be considered a robot.

Background[edit]

While robots have often been described as possessing social qualities (see, for example, the tortoises developed by William Grey Walter in the 1950s), social robotics is a fairly recent branch of robotics. Since the early 1990s, artificial intelligence and robotics researchers have developed robots that explicitly engage with people on a social level. Notable researchers include Cynthia Breazeal, Tony Belpaeme, Aude Billard, Kerstin Dautenhahn, Yiannis Demiris, Hiroshi Ishiguro, Maja Mataric, Javier Movellan, Brian Scassellati and Dean Weber. Also related is the Kansei engineering movement in Japanese science and technology; for social robotics, see especially work by Takayuki Kanda, Hideki Kozima, Hiroshi Ishiguro, Michio Okada, Tomio Watanabe, and P. Ravindra S. De Silva.

Designing an autonomous social robot is particularly challenging, as the robot needs to correctly interpret people's actions and respond appropriately, which is not yet possible. Moreover, people interacting with a social robot may hold very high expectations of its capabilities, based on science-fiction representations of advanced social robots. As a result, many social robots are partially or fully remote controlled to simulate advanced capabilities. This method of (often covertly) controlling a social robot is referred to as a Mechanical Turk (after the 18th-century chess-playing automaton) or a Wizard of Oz (after the character in the L. Frank Baum book). Wizard of Oz studies are useful in social robotics research for evaluating how people respond to social robots.
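In software terms, a Wizard of Oz setup is an operator-in-the-loop control program: a hidden experimenter triggers pre-scripted social behaviours that the participant perceives as autonomous. The following minimal sketch illustrates the idea; `RobotInterface`, its methods, and the command keys are hypothetical illustrations, not the API of any real research platform.

```python
# Minimal Wizard-of-Oz sketch: a hidden operator selects the robot's
# apparently autonomous social behaviours from a keyed command map.
# RobotInterface is a hypothetical stand-in, not a real platform API.

class RobotInterface:
    """Records the commands that a real robot would execute."""
    def __init__(self):
        self.log = []

    def say(self, text):
        self.log.append(("say", text))

    def gesture(self, name):
        self.log.append(("gesture", name))

# Keyboard shortcuts the hidden operator presses; the participant only
# sees the robot's resulting speech and motion, never the operator.
COMMANDS = {
    "g": lambda r: r.say("Hello! Nice to meet you."),
    "w": lambda r: r.gesture("wave"),
    "n": lambda r: r.gesture("nod"),
}

def wizard_step(robot, key):
    """Execute one operator keypress; unknown keys are ignored."""
    action = COMMANDS.get(key)
    if action:
        action(robot)

robot = RobotInterface()
for key in "gw":  # operator triggers a greeting, then a wave
    wizard_step(robot, key)
```

In a real study the command map would cover the full behaviour repertoire under test, and the participant-facing robot would replace the logging stub.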

Definition[edit]

A robot is defined by the International Organization for Standardization as a reprogrammable, multifunctional manipulator designed to move material, parts, tools or specialized devices through variable programmed motions for the performance of a variety of tasks. As a subset of robots, social robots perform any or all of these processes in the context of a social interaction: a social robot interacts socially with humans or evokes social responses from them.[1] The nature of the social interaction is immaterial and may range from relatively simple supportive tasks, such as passing tools to a worker, to complex expressive communication and collaboration, such as assistive healthcare. Hence, social robots are expected to work together with humans in collaborative workspaces. Moreover, social robots are increasingly following humans into much more personal settings such as the home, health care, and education.[2]

Social interactions are likely to be cooperative, but the definition is not limited to this situation; uncooperative behavior can also be considered social in certain contexts. The robot could, for example, exhibit competitive behavior within the framework of a game. The robot could also interact with minimal or no communication; it could, for example, hand tools to an astronaut working on a space station. However, it is likely that some communication will be necessary at some point.

Two suggested[3] ultimate requirements for social robots are the Turing test, to determine the robot's communication skills, and Isaac Asimov's Three Laws of Robotics, for its behavior. The usefulness of applying these requirements in a real-world application, especially in the case of Asimov's laws, is still disputed[4] and may not be possible at all. However, a consequence of this viewpoint is that a robot that only interacts and communicates with other robots would not be considered a social robot: being social is bound to humans and their society, which defines the necessary social values, norms and standards.[5] This results in a cultural dependency of social robots, since social values, norms and standards differ between cultures.

This brings us directly to the last part of the definition: a social robot must interact within the social rules attached to its role. The role and its rules are defined by society. For example, a robotic butler for humans would have to comply with established rules of good service: it should be anticipatory, reliable and, most of all, discreet. A social robot must be aware of these rules and comply with them. However, social robots that interact with other autonomous robots would behave and interact according to non-human conventions. In most social robots, the complexity of human-to-human interaction will be gradually approached with the advancement of android technology (a form of humanoid robot) and the implementation of a variety of more human-like communication skills.[6]

Social Interaction[edit]

Researchers have investigated user engagement with robot companions, and the literature presents several models of engagement. One example is a framework that models both the causes and effects of engagement, combining features of the user's non-verbal behaviour, the task, and the companion's affective reactions to predict children's level of engagement.[7]
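The kind of framework described above can be sketched as a classifier that combines the three feature groups into a single engagement estimate. The following toy logistic model is purely illustrative: the feature names and weights are assumptions for the sketch, not values from the cited study, which learned its model from annotated interaction data.

```python
import math

# Toy engagement-prediction sketch: combine non-verbal, task, and
# companion-affect features into one engagement probability.
# Feature names and weights are illustrative assumptions only.

WEIGHTS = {
    "gaze_at_robot": 2.0,          # non-verbal behaviour
    "smiles": 1.5,                 # non-verbal behaviour
    "task_progress": 1.0,          # task feature
    "robot_positive_affect": 0.8,  # companion's affective reaction
}
BIAS = -2.0

def engagement_probability(features):
    """Logistic model over feature values normalized to [0, 1]."""
    score = BIAS + sum(WEIGHTS[k] * features.get(k, 0.0) for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-score))

engaged = engagement_probability(
    {"gaze_at_robot": 0.9, "smiles": 0.7,
     "task_progress": 0.8, "robot_positive_affect": 0.6}
)
disengaged = engagement_probability({"gaze_at_robot": 0.1})
```

A deployed system would learn such weights from labelled recordings and could feed the prediction back into the robot's behaviour selection, closing the loop between detected engagement and the companion's affective reactions.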

Many people are uneasy about interacting socially with a robot and, in general, people tend to prefer smaller robots to large humanoid robots. They also prefer robots to do tasks like cleaning the house rather than providing companionship.[8] Despite initial reluctance to interact with social robots, exposure to a social robot may decrease uncertainty and increase willingness to interact with the robot.[9] If people have an interaction with a social robot that is seen as playful (as opposed to focused on completing a task or being social) they may be more likely to engage with the robot in the future.[10]

Societal Impacts[edit]

The increasingly widespread use of more advanced social robots is one of several phenomena expected to contribute to the technological posthumanization of human societies, through which process “a society comes to include members other than ‘natural’ biological human beings who, in one way or another, contribute to the structures, dynamics, or meaning of the society.”[11]

Uses in Healthcare[edit]

Social robots have been used increasingly in healthcare settings, and recent research has explored their applicability as mental health interventions for children.[12] A scoping review analyzed the impacts that robots such as Nao, Paro, Huggable, Tega and Pleo have on children in various intervention settings.[12] Results from this work highlighted that depression and anger may be reduced in children working with social robots, whereas anxiety and pain yielded mixed results.[12] Distress was found to be reduced in children who interacted with robots.[12] Finally, this scoping review found that affect was positively impacted by interaction with robots, such that children smiled for longer and demonstrated growth mindsets when playing games.[12] It is worth noting that robots can be used instead of animal-assisted therapy for children who are allergic or immunocompromised.[12] Sanitation is a necessary issue to consider; however, with washable covers or sanitizable surfaces, this becomes less of a problem in medical settings.[12]

Another review analyzed data from previous studies and found further support that social robots may reduce negative symptoms children experience in healthcare settings.[13] Social robots can be used as tools for distracting children from procedures, like getting a shot, and have demonstrated the ability to reduce stress and the experience of pain.[13] Children who interacted with both a psychotherapist and a robot assistant for therapy experienced reduced anger, anxiety, and depression when coping with cancer, compared to a control group.[13] There is some evidence that free play with a robot while hospitalized can help children experience more positive moods.[13] More work needs to be done to analyze the impact of social robots on children in psychiatric wards, as evidence revealed that some children may dislike the robot and feel it is dangerous.[13] Overall, further research should be conducted to fully understand the impact of social robots on reducing negative mental health symptoms in children, but there appear to be advantages to utilizing social robots in healthcare settings.[12][13]

Social robots have been shown to have beneficial outcomes for children with Autism Spectrum Disorder (ASD).[14] As many individuals with ASD tend to prefer predictable interactions, robots may be a viable option for social interaction.[14] Previous research on interactions between children with ASD and robots has demonstrated positive benefits, for instance shared attention, increased eye contact, and interpersonal synchrony.[14] Various types of robots have the potential to deliver these benefits for children with ASD, from humanoid robots like KASPAR, to cartoonish robots such as Tito, to animal-like robots like Probo, to machine-like robots such as Nao.[14] One problem that may hinder the advantages of social robots as social interaction tools for children with ASD is the uncanny valley, as the eerie human-likeness of a robot may be overstimulating and anxiety-inducing, as one study found with Keepon.[14] It appears that social robots provide an opportunity to increase social skills in children with ASD, and future research should investigate this topic further.

Individuals with cognitive impairments, such as dementia and Alzheimer's disease, may also benefit from social robots.[15][16] In their study, Moro et al. (2018) used three device types, a human-like robot (Casper), a character-like robot (Ed), and a tablet, to help six individuals with mild cognitive impairment make a cup of tea.[15] Results demonstrated that, to an extent, the humanoid robot was the most engaging to individuals with cognitive impairments, likely due to the expressiveness of its face compared to the minimal expression of Ed and the tablet.[15] Participants also anthropomorphized the human-like and character-like robots more than the tablet, addressing them and asking them questions, further indicating a preference for the social robots.[15] Additionally, participants perceived the human-like robot to be useful both in social situations and in completing activities of daily living, whereas the character-like robot and the tablet were seen as useful only for activities of daily living.[15] Another study, by Moyle et al. (2019), investigated the impact that providing an individual with dementia a robot toy, Paro, versus a plush toy would have on caregivers' and family members' perception of the individual's well-being.[16] This study highlighted the ways in which some long-term care facilities may offer minimal stimulation for dementia patients, which can lead to boredom and increased agitation.[16] After the trial, caregivers and family members were asked to assess the well-being of the individual with dementia and, overall, the group that interacted with Paro was perceived to be happier, more engaged, and less agitated.[16] One of the main issues with utilizing Paro, despite its perceived benefits, is its cost; future research must investigate more cost-effective options for older adult care.[16] Another issue in conducting research with individuals with cognitive impairments and social robots is their ability to consent.[17] In some cases, informed consent by proxy can be utilized; however, the benefits and risks should be weighed before conducting any research.[17] Long-term research suggests that residents of care homes are willing to interact with humanoid robots and can benefit from cognitive and physical activation led by the robot Pepper.[18]

The ethics of the use of social robots in healthcare should also be mentioned. One potential risk of social robots is deception: there may be an expectation that the robot can perform certain functions when it actually cannot.[19] For example, with increased human-likeness and anthropomorphic traits, humans interacting with robots might assume the robot has feelings and thoughts, which is misleading.[19] Isolating older adults from humans is also a risk of social robots, in that these robots may make up a significant amount of the individual's social interaction.[19] Currently there is little evidence about the long-term impacts this limited human contact and increased robot interaction may have.[19] Some social robots also have a built-in telepresence capacity through which individuals can videoconference with family, caregivers, and medical staff, which may decrease loneliness and isolation.[20] The video capability of some robots is a potential avenue for social interaction and for increasing the accessibility of medical assessments.[20] The dignity of persons interacting with robots should also be respected: individuals might find some robots, like the cuddly toy-like Paro, to be infantilizing, and future investigations should explore how best to increase the autonomy of patients interacting with robots.[19] Furthermore, privacy is another ethical concern, in that some social robots can collect and store video data or data from sensors.[19] The stored data is at risk of being stolen or hacked, which negatively impacts individual privacy.[19] The safety of individuals interacting with robots is another concern, in that robots may accidentally cause harm, for example by bumping into someone and causing them to fall.[19] Ethical considerations should be taken into account before introducing robots into healthcare settings.

Examples[edit]

One of the most well-known social robots currently in development is Sophia, developed by Hanson Robotics. Sophia is a social humanoid robot that can display more than 50 facial expressions, and is the first non-human to be given a United Nations title.

SoftBank Robotics has developed multiple social, semi-humanoid robots which are frequently used in research, including Pepper and Nao. Pepper is used both commercially and academically, as well as being used by consumers in over a thousand homes in Japan.

Other notable examples of social robots include ASIMO by Honda, Jibo, Moxi, and Kaspar, designed by the University of Hertfordshire to help children with autism learn responses through games and interactive play.[21] Anki's robots Cozmo and Vector also fell into the category of social robots, but the company ceased operations in 2019.

Social robots do not necessarily have to be humanoid. The most famous example of a non-humanoid social robot is Paro the seal.

See also[edit]

Further references[edit]

  • Walter, W. Grey (May 1950). "An Imitation of Life". Scientific American. pp. 42–45.
  • Dautenhahn, Kerstin (1994). Gaussier, P.; Nicoud, J. D. (eds.). "Trying to Imitate - a Step Towards Releasing Robots from Social Isolation". Proceedings: From Perception to Action Conference. Lausanne, Switzerland: IEEE Computer Society Press: 290–301. doi:10.1109/FPA.1994.636112. ISBN 0-8186-6482-7. S2CID 152231331.
  • Dautenhahn, Kerstin (1995). "Getting to know each other - artificial social intelligence for autonomous robots". Robotics and Autonomous Systems. 16 (2–4): 333–356. doi:10.1016/0921-8890(95)00054-2.
  • Breazeal, Cynthia L. (2002). Designing Sociable Robots. MIT Press. ISBN 0-262-02510-8.
  • Fong, Terrence; Nourbakhsh, Illah R.; Dautenhahn, Kerstin (2003). "A survey of socially interactive robots". Robotics and Autonomous Systems. 42 (3–4): 143–166. doi:10.1016/S0921-8890(02)00372-X.

References[edit]

  1. ^ Leite, Iolanda; Martinho, Carlos; Paiva, Ana (April 2013). "Social Robots for Long-Term Interaction: A Survey". International Journal of Social Robotics. 5 (2): 291–308. doi:10.1007/s12369-013-0178-y. ISSN 1875-4791. S2CID 3721600.
  2. ^ Lin, Chaolan; Šabanović, Selma; Dombrowski, Lynn; Miller, Andrew D.; Brady, Erin; MacDorman, Karl F. (2021). "Parental Acceptance of Children's Storytelling Robots: A Projection of the Uncanny Valley of AI". Frontiers in Robotics and AI. 8 (579993): 1–15. doi:10.3389/frobt.2021.579993. ISSN 2296-9144.
  3. ^ David Feil-Seifer, Kristine Skinner and Maja J. Matarić, "Benchmarks for evaluating socially assistive robotics", Interaction Studies: Psychological Benchmarks of Human-Robot Inteaction [sic], 8(3), 423-429 Oct, 2007
  4. ^ Towards Data Science: Asimov’s Laws of Robotics, and why AI may not abide by them
  5. ^ Taipale, S., Vincent, J., Sapio, B., Lugano, G. and Fortunati, L. (2015) Introduction: Situating the Human in Social Robots. In J. Vincent et al., eds. Social Robots from a Human Perspective, Dordrecht: Springer, pp. 1-17
  6. ^ "Implications of interpersonal communication competence research on the design of artificial behavioral systems that interact with humans". Retrieved 3 March 2017.
  7. ^ Castellano, Ginevra; Pereira, André; Leite, Iolanda; Paiva, Ana; McOwan, Peter W. (2009). "Detecting user engagement with a robot companion using task and social interaction-based features". Proceedings of the 2009 International Conference on Multimodal Interfaces - ICMI-MLMI '09. Cambridge, Massachusetts, USA: ACM Press: 119. doi:10.1145/1647314.1647336. ISBN 9781605587721. S2CID 3358106.
  8. ^ Ray, Celine; Mondada, Francesco; Siegwart, Roland (September 2008). "What do people expect from robots?". 2008 IEEE/RSJ International Conference on Intelligent Robots and Systems: 3816–3821. doi:10.1109/IROS.2008.4650714. ISBN 978-1-4244-2057-5. S2CID 9253964.
  9. ^ Haggadone, Brad A.; Banks, Jaime; Koban, Kevin (2021-04-07). "Of robots and robotkind: Extending intergroup contact theory to social machines". Communication Research Reports. 0: 1–11. doi:10.1080/08824096.2021.1909551.
  10. ^ Banks, Jaime; Koban, Kevin; Chauveau, Philippe (2021-04-15). "Forms and Frames: Mind, Morality, and Trust in Robots across Prototypical Interactions". Human-Machine Communication. 2 (1): 81–103. doi:10.30658/hmc.2.4.
  11. ^ Gladden, Matthew (2018). Sapient Circuits and Digitalized Flesh: The Organization as Locus of Technological Posthumanization (second ed.). Indianapolis, IN: Defragmenter Media. p. 19. ISBN 978-1-944373-21-4.
  12. ^ a b c d e f g h Kabacińska, Katarzyna; Prescott, Tony J.; Robillard, Julie M. (2020-07-27). "Socially Assistive Robots as Mental Health Interventions for Children: A Scoping Review". International Journal of Social Robotics. doi:10.1007/s12369-020-00679-0. ISSN 1875-4791.
  13. ^ a b c d e f Moerman, Clara J; van der Heide, Loek; Heerink, Marcel (December 2019). "Social robots to support children's well-being under medical treatment: A systematic state-of-the-art review". Journal of Child Health Care. 23 (4): 596–612. doi:10.1177/1367493518803031. ISSN 1367-4935. PMID 30394806. S2CID 53219310.
  14. ^ a b c d e Sartorato, Felippe; Przybylowski, Leon; Sarko, Diana K. (July 2017). "Improving therapeutic outcomes in autism spectrum disorders: Enhancing social communication and sensory processing through the use of interactive robots". Journal of Psychiatric Research. 90: 1–11. doi:10.1016/j.jpsychires.2017.02.004. PMID 28213292.
  15. ^ a b c d e Moro, Christina; Lin, Shayne; Nejat, Goldie; Mihailidis, Alex (2019-01-01). "Social Robots and Seniors: A Comparative Study on the Influence of Dynamic Social Features on Human–Robot Interaction". International Journal of Social Robotics. 11 (1): 5–24. doi:10.1007/s12369-018-0488-1. ISSN 1875-4805. S2CID 68237859.
  16. ^ a b c d e Moyle, Wendy; Bramble, Marguerite; Jones, Cindy J; Murfield, Jenny E (2017-11-19). ""She Had a Smile on Her Face as Wide as the Great Australian Bite": A Qualitative Examination of Family Perceptions of a Therapeutic Robot and a Plush Toy". The Gerontologist. 59 (1): 177–185. doi:10.1093/geront/gnx180. hdl:10072/375764. ISSN 0016-9013. PMID 29165558.
  17. ^ a b Körtner, T. (June 2016). "Ethical challenges in the use of social service robots for elderly people". Zeitschrift für Gerontologie und Geriatrie. 49 (4): 303–307. doi:10.1007/s00391-016-1066-5. ISSN 0948-6704. PMID 27220734. S2CID 20690764.
  18. ^ Carros, Felix; Meurer, Johanna; Löffler, Diana; Unbehaun, David; Matthies, Sarah; Koch, Inga; Wieching, Rainer; Randall, Dave; Hassenzahl, Marc; Wulf, Volker (21 April 2020). "Exploring Human-Robot Interaction with the Elderly: Results from a Ten-Week Case Study in a Care Home". Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems: 1–12. doi:10.1145/3313831.3376402.
  19. ^ a b c d e f g h Körtner, T. (June 2016). "Ethical challenges in the use of social service robots for elderly people". Zeitschrift für Gerontologie und Geriatrie. 49 (4): 303–307. doi:10.1007/s00391-016-1066-5. ISSN 0948-6704. PMID 27220734. S2CID 20690764.
  20. ^ a b Moyle, Wendy; Arnautovska, Urska; Ownsworth, Tamara; Jones, Cindy (December 2017). "Potential of telepresence robots to enhance social connectedness in older adults with dementia: an integrative review of feasibility". International Psychogeriatrics. 29 (12): 1951–1964. doi:10.1017/S1041610217001776. hdl:10072/376115. ISSN 1041-6102. PMID 28879828. S2CID 22545504.
  21. ^ "Robot at Hertfordshire University aids autistic children". BBC. Retrieved 28 Sep 2014.

External links[edit]