Human–robot interaction


Human–robot interaction is the study of interactions between humans and robots. It is often referred to as HRI by researchers. Human–robot interaction is a multidisciplinary field with contributions from human–computer interaction, artificial intelligence, robotics, natural language understanding, design, and social sciences.


Human–robot interaction has been a topic of both science fiction and academic speculation since before any robots existed. Because HRI depends on knowledge of human communication, many aspects of HRI are continuations of human communication topics that are much older than robotics.

The origin of HRI as a discrete problem can be traced to 20th-century author Isaac Asimov, who formulated the Three Laws of Robotics in his 1942 short story "Runaround", later collected in I, Robot (1950):

  • A robot may not injure a human being or, through inaction, allow a human being to come to harm.
  • A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
  • A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

These three laws of robotics frame the idea of safe interaction: the closer humans and robots get, and the more intricate the relationship becomes, the greater the risk of a human being injured. In industry today, manufacturers employing robots often address this issue by not letting humans and robots share the workspace at any time. This is achieved by defining safe zones using lidar sensors or physical cages, so that the presence of humans in the robot's workspace is completely forbidden while it is working.

With advances in artificial intelligence, autonomous robots could eventually exhibit more proactive behaviors, planning their motion in complex, unknown environments. These new capabilities keep safety as the primary concern and efficiency as secondary. To enable this new generation of robots, research is being conducted on human detection, motion planning, scene reconstruction, intelligent behavior through task planning, and compliant behavior using force control (impedance or admittance control schemes).
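The compliant behavior mentioned above can be illustrated with a one-dimensional admittance controller. This is a minimal sketch, not any specific robot's controller; the virtual mass, damping, and stiffness values below are made up for illustration.

```python
def admittance_step(x, v, f_ext, dt, m=2.0, b=20.0, k=50.0):
    """One Euler step of a 1-D admittance controller.

    The virtual dynamics  m*a + b*v + k*x = f_ext  map a measured
    external force f_ext into a compliant position offset x, which a
    stiff inner position loop would then track.
    """
    a = (f_ext - b * v - k * x) / m
    v = v + a * dt
    x = x + v * dt
    return x, v

# A constant 5 N push: the offset settles near f_ext/k = 0.1 m,
# so the robot yields to the contact force instead of fighting it.
x, v = 0.0, 0.0
for _ in range(5000):
    x, v = admittance_step(x, v, f_ext=5.0, dt=0.001)
```

Raising the virtual stiffness k makes the robot yield less to the same force; lowering it makes the robot more compliant, which is the trade-off these control schemes expose.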

The goal of HRI research is to define models of humans' expectations regarding robot interaction to guide robot design and algorithmic development that would allow more natural and effective interaction between humans and robots. Research ranges from how humans work with remote, tele-operated unmanned vehicles to peer-to-peer collaboration with anthropomorphic robots.

Many in the field of HRI study how humans collaborate and interact and use those studies to motivate how robots should interact with humans.

The goal of friendly human–robot interactions

Kismet can produce a range of facial expressions.

Robots are artificial agents with capacities of perception and action in the physical world, often referred to by researchers as the workspace. Their use has become widespread in factories, but nowadays they also tend to be found in the most technologically advanced societies in such critical domains as search and rescue, military battle, mine and bomb detection, scientific exploration, law enforcement, entertainment, and hospital care.

These new application domains imply a closer interaction with the user. Closeness here is meant in its full sense: robots and humans share not only the workspace but also goals in terms of task achievement. This close interaction requires new theoretical models, on one hand for the robotics scientists working to improve the robots' utility, and on the other hand to evaluate the risks and benefits of this new "friend" for modern society.

With the advances in AI, research focuses not only on the safest physical interaction but also on socially appropriate interaction, which depends on cultural criteria. The goal is to build intuitive, easy communication with the robot through speech, gestures, and facial expressions.

Dautenhahn refers to friendly human–robot interaction as "Robotiquette", defining it as the "social rules for robot behaviour (a 'robotiquette') that is comfortable and acceptable to humans".[1] The robot has to adapt itself to our way of expressing desires and orders, not the contrary. But everyday environments such as homes have much more complex social rules than those implied by factories or even military environments. Thus, the robot needs perceiving and understanding capacities to build dynamic models of its surroundings: it must categorize objects, recognize and locate humans, and further infer their emotions. The need for such dynamic capacities pushes forward every sub-field of robotics.

Furthermore, by understanding and perceiving social cues, robots can enable collaborative scenarios with humans. For example, with the rapid rise of personal fabrication machines such as desktop 3D printers and laser cutters entering our homes, scenarios may arise where robots and humans collaboratively share control, coordinate, and achieve tasks together. Industrial robots have already been integrated into assembly lines and are working collaboratively with humans. The social impact of such robots has been studied[2] and indicates that workers still treat robots as social entities and rely on social cues to understand and work with them.

At the other end of HRI research, cognitive modelling of the "relationship" between humans and robots benefits both psychologists and robotics researchers, and user studies are often of interest to both sides. This research endeavour touches many parts of human society. For effective human–humanoid robot interaction,[3] numerous communication skills[4] and related features should be implemented in the design of such artificial agents/systems.

General HRI research

HRI research spans a wide range of fields, some general to the nature of HRI.

Methods for perceiving humans

Methods for perceiving humans in the environment are based on sensor information. Research on sensing components and software, led by Microsoft, has provided useful results for extracting human kinematics (see Kinect). An example of an older technique is to use colour information, for example the fact that for light-skinned people the hands are lighter than the clothes worn. In any case, a human model defined a priori can then be fitted to the sensor data. The robot builds or has (depending on its level of autonomy) a 3D map of its surroundings, to which the humans' locations are assigned.
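The colour-based cue described above can be sketched as a crude pixel classifier. The thresholds below are illustrative rules of thumb, not values from any cited system; real perception pipelines use calibrated colour spaces and learned models.

```python
import numpy as np

def skin_mask(rgb):
    """Very rough skin-pixel mask over an HxWx3 uint8 RGB image.

    Illustrative only: a crude rule of thumb (red dominant over green
    and blue) standing in for the colour-based cues the text describes.
    """
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    return (r > 95) & (g > 40) & (b > 20) & (r > g) & (r > b) & (r - g > 15)

# A 1x2 image: one skin-like pixel, one blue pixel.
img = np.array([[[210, 150, 130], [30, 60, 200]]], dtype=np.uint8)
mask = skin_mask(img)
```

Connected regions of such a mask would then be candidate hands or faces to which a prior human model can be fitted, as the text describes.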

Most methods aim to build a 3D model of the environment through vision. Proprioceptive sensors give the robot information about its own state, expressed relative to a reference frame.

A speech recognition system is used to interpret human desires or commands. By combining the information inferred from proprioception, exteroceptive sensors, and speech, the robot can estimate the human's position and state (standing, seated). In this context, natural language processing is concerned with the interactions between computers and human (natural) languages, in particular how to program computers to process and analyze large amounts of natural language data; for instance, neural network architectures and learning algorithms can be applied to tasks such as part-of-speech tagging, chunking, named entity recognition, and semantic role labeling.[5]
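As a toy illustration of mapping recognized speech to robot commands, the sketch below matches utterances against a small hand-written grammar. The command patterns and intent names are invented for illustration; a real system would use the statistical NLP techniques cited above rather than regular expressions.

```python
import re

# Hypothetical intent patterns for a home robot; purely illustrative.
COMMANDS = {
    r"(go|move) to the (?P<place>\w+)": "GOTO",
    r"(pick|grab) up the (?P<object>\w+)": "PICK",
    r"stop": "STOP",
}

def interpret(utterance):
    """Map a recognized utterance to an (intent, slots) pair."""
    text = utterance.lower().strip()
    for pattern, intent in COMMANDS.items():
        m = re.fullmatch(pattern, text)
        if m:
            return intent, m.groupdict()
    return "UNKNOWN", {}

intent, slots = interpret("Go to the kitchen")
```

The gap between this rigid grammar and the open-ended way people actually phrase requests is exactly why the field leans on learned language models instead.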

Methods for motion planning

Motion planning in dynamic environments is a challenge that is currently achieved only for robots with 3 to 10 degrees of freedom. Humanoid robots, or even two-armed robots, which can have up to 40 degrees of freedom, are unsuited for dynamic environments with today's technology. However, lower-dimensional robots can use the potential field method to compute trajectories that avoid collisions with humans.
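The potential field method mentioned above can be sketched for a 2-D point robot: an attractive potential pulls it toward the goal while a repulsive potential pushes it away from a detected human. The gains, step size, and influence radius below are illustrative, not tuned values.

```python
import numpy as np

def potential_step(pos, goal, obstacle, step=0.05,
                   k_att=1.0, k_rep=0.5, influence=1.0):
    """One gradient-descent step on an artificial potential field."""
    # Gradient of the attractive potential 0.5*k_att*|pos - goal|^2.
    grad = k_att * (pos - goal)
    diff = pos - obstacle
    d = np.linalg.norm(diff)
    if d < influence:
        # Gradient of 0.5*k_rep*(1/d - 1/influence)^2: points away
        # from the obstacle, growing sharply as d shrinks.
        grad += -k_rep * (1.0 / d - 1.0 / influence) * diff / d**3
    return pos - step * grad

pos = np.array([0.0, 0.0])
goal = np.array([2.0, 0.0])
human = np.array([1.0, 0.5])  # detected human near the straight-line path
for _ in range(800):
    pos = potential_step(pos, goal, human)
```

The robot bends its path around the human's influence radius and then converges on the goal. The method's known weakness, local minima when an obstacle sits exactly between start and goal, is one reason it is limited to low-dimensional robots.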

Cognitive models and theory of mind

Humans exhibit negative social and emotional responses as well as decreased trust toward some robots that closely, but imperfectly, resemble humans; this phenomenon has been termed the "Uncanny Valley."[6] However, recent research on telepresence robots has established that mimicking human body postures and expressive gestures makes robots likeable and engaging in a remote setting.[7] Further, the presence of a human operator was felt more strongly when tested with an android or humanoid telepresence robot than with normal video communication through a monitor.[8]

While there is a growing body of research about users' perceptions and emotions towards robots, we are still far from a complete understanding. Only additional experiments will determine a more precise model.

Based on past research, we have some indications about current user sentiment and behavior around robots:[9][10]

  • During initial interactions, people are more uncertain, anticipate less social presence, and have fewer positive feelings when thinking about interacting with robots, and prefer to communicate with a human. This finding has been called the human-to-human interaction script.
  • It has been observed that when the robot performs a proactive behaviour and does not respect a "safety distance" (by penetrating the user space) the user sometimes expresses fear. This fear response is person-dependent.
  • It has also been shown that when a robot has no particular use, negative feelings are often expressed. The robot is perceived as useless and its presence becomes annoying.
  • People have also been shown to attribute personality characteristics to the robot that were not implemented in software.

Methods for human-robot coordination

A large body of work in the field of human-robot interaction has looked at how humans and robots may better collaborate. The primary social cue for humans while collaborating is the shared perception of an activity. To this end, researchers have investigated anticipatory robot control through various methods, including monitoring the behaviors of human partners using eye tracking, making inferences about human task intent, and proactive action on the part of the robot.[11] These studies revealed that anticipatory control helped users perform tasks faster than reactive control alone.

A common approach to programming social cues into robots is to first study human-human behaviors and then transfer what is learned. For example, coordination mechanisms in human-robot collaboration[12] are based on work in neuroscience[13] that examined how to enable joint action in human-human configurations by studying perception and action in a social context rather than in isolation. These studies revealed that maintaining a shared representation of the task is crucial for accomplishing tasks in groups. For example, the authors examined the task of driving together by separating the responsibilities of acceleration and braking, i.e., one person responsible for accelerating and the other for braking; the study revealed that pairs reached the same level of performance as individuals only when they received feedback about the timing of each other's actions. Similarly, researchers have studied human-human handovers in household scenarios, such as passing dining plates, in order to enable adaptive control of human-robot handovers.[14] Most recently, researchers have studied a system that automatically distributes assembly tasks among co-located workers to improve coordination.[15]

Application Areas

In addition to general HRI research, researchers are currently exploring application areas for human-robot interaction systems. Application-oriented research is used to help bring current robotics technologies to bear on problems that exist in today's society. While human-robot interaction is still a rather young area of interest, there is active development and research in many areas.[citation needed]


Assistive robots are broadly defined as robots that help or support human users. Research on assistive robots includes rehabilitation robots, wheelchair and walking robots, companion robots, and educational robots.[16] These robots are widely used in a variety of settings, including schools, hospitals, and homes. In the past, assistive robots mainly helped people through physical interaction; in recent years the definition has gradually expanded, and providing help through non-contact social interaction has become an emerging direction.[citation needed]

Some researchers are also exploring how robots can be used to improve people's education, especially for children. In some specific areas, robots have proven more suitable for teaching than people.[17] Educational robots are used, for example, to teach social skills to children with autism spectrum disorder (ASD), for whom a robot can be a more approachable social partner than a person. Studies show that interaction with a robot helps children with ASD engage more readily in conversation, dance, and social play activities. Expression of emotion is thought to be important during interaction between humans and robots.[18] Some robots convey emotion by changing their surface texture,[19] and some can convey emotion intuitively and comfortably through well-designed appearance and behaviors.[20]

Search and rescue

Urban Search and Rescue (USAR) has become increasingly important in the HRI field; for example, robots were used for rescue and recovery work after the collapse of the World Trade Center.[21] In search and rescue operations, humans act as supervisors and robots as operators: humans give instructions to search or remove heavy objects, and robots are responsible for executing these commands. The presence of robots greatly reduces the risk to rescue workers, and the efficiency of cooperation between humans and robots is crucial to the success rate of search and rescue. Through groundbreaking government and academic efforts, USAR has developed into one of the most important areas of HRI.[citation needed]

Military and police

In military and police applications, human-robot interaction also plays a vital role. Humans act as commanders, while robots are responsible for information acquisition, bomb disposal, and other tactical tasks.[22] The involvement of robots greatly enhances the ability to acquire information and to counter reconnaissance while performing tasks, thereby minimizing the risk to soldiers. In this process, the interaction between humans and robots has a great impact on the battlefield situation.[citation needed]

Other application areas include:

  • Space exploration
  • Field robotics
  • Home and companion robotics
  • Hospitality
  • Rehabilitation and Elder Care
  • Robot Assisted Therapy (RAT)
  • UAV Reconnaissance and UUV Applications


Bartneck and Okada[23] suggest that a robotic user interface can be described by the following four properties:

Tool – toy scale
  • Is the system designed to solve a problem effectively or is it just for entertainment?
Remote control – autonomous scale
  • Does the robot require remote control or is it capable of action without direct human influence?
Reactive – dialogue scale
  • Does the robot rely on a fixed interaction pattern or is it able to have dialogue — exchange of information — with a human?
Anthropomorphism scale
  • Does it have the shape or properties of a human?
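The four scales can be sketched as a simple data record; the field names and the 0-to-1 encoding below are illustrative choices, not part of Bartneck and Okada's formulation.

```python
from dataclasses import dataclass

@dataclass
class RoboticUserInterface:
    """Bartneck and Okada's four descriptive properties, each encoded
    here (an illustrative choice) as a value in [0, 1]."""
    tool_toy: float           # 0 = pure tool, 1 = pure toy
    remote_autonomous: float  # 0 = remote-controlled, 1 = fully autonomous
    reactive_dialogue: float  # 0 = fixed reactions, 1 = open dialogue
    anthropomorphism: float   # 0 = machine-like, 1 = human-like

# E.g. a robotic vacuum: a tool, mostly autonomous, reactive rather
# than conversational, not human-shaped (hypothetical placement).
vacuum = RoboticUserInterface(0.1, 0.9, 0.2, 0.0)
```

Plotting different robots along these four axes gives a compact way to compare interface designs, which is how such taxonomies are typically used.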


International Conference on Social Robotics

The International Conference on Social Robotics is a conference for scientists, researchers, and practitioners to report and discuss the latest research and findings in social robotics, as well as robots' interactions with human beings and their integration into society.

  • ICSR2009, Incheon, Korea in collaboration with the FIRA RoboWorld Congress
  • ICSR2010, Singapore
  • ICSR2011, Amsterdam, Netherlands

International Conference on Human-Robot Personal Relationships

International Symposium on New Frontiers in Human-Robot Interaction

This symposium is organized in collaboration with the Annual Convention of the Society for the Study of Artificial Intelligence and Simulation of Behaviour.

  • 2015, Canterbury, United Kingdom
  • 2014, London, United Kingdom
  • 2010, Leicester, United Kingdom
  • 2009, Edinburgh, United Kingdom

IEEE International Symposium on Robot and Human Interactive Communication

The IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN) was founded in 1992 by Professors Toshio Fukuda, Hisato Kobayashi, Hiroshi Harashima, and Fumio Hara. Early workshop participants were mostly Japanese, and the first seven workshops were held in Japan. Since 1999, workshops have also been held in Europe and the United States, and participation has become international in scope.

ACM/IEEE International Conference on Human-Robot Interaction

This conference is among the leading conferences in the field of HRI and has a highly selective reviewing process. The average acceptance rate is 26% and the average attendance is 187 participants. Around 65% of the contributions to the conference come from the US, and the quality of the submissions is reflected in the average of 10 citations that HRI papers have attracted so far.[24]

  • HRI 2006 in Salt Lake City, Utah, USA, Acceptance Rate: 0.29
  • HRI 2007 in Washington, D.C., USA, Acceptance Rate: 0.23
  • HRI 2008 in Amsterdam, Netherlands, Acceptance Rate: 0.36 (0.18 for oral presentations)
  • HRI 2009 in San Diego, CA, USA, Acceptance Rate: 0.19
  • HRI 2010 in Osaka, Japan, Acceptance Rate: 0.21
  • HRI 2011 in Lausanne, Switzerland, Acceptance Rate: 0.22 for full papers
  • HRI 2012 in Boston, Massachusetts, USA, Acceptance Rate: 0.25 for full papers
  • HRI 2013 in Tokyo, Japan, Acceptance Rate: 0.24 for full papers
  • HRI 2014 in Bielefeld, Germany, Acceptance Rate: 0.24 for full papers
  • HRI 2015 in Portland, Oregon, USA, Acceptance Rate: 0.25 for full papers
  • HRI 2016 in Christchurch, New Zealand, Acceptance Rate: 0.25 for full papers
  • HRI 2017 in Vienna, Austria, Acceptance Rate: 0.24 for full papers
  • HRI 2018 in Chicago, USA, Acceptance Rate: 0.24 for full papers

International Conference on Human-Agent Interaction

Related conferences

There are many conferences that are not exclusively HRI, but deal with broad aspects of HRI, and often have HRI papers presented.

  • IEEE-RAS/RSJ International Conference on Humanoid Robots (Humanoids)
  • Ubiquitous Computing (UbiComp)
  • IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
  • Intelligent User Interfaces (IUI)
  • Computer Human Interaction (CHI)
  • American Association for Artificial Intelligence (AAAI)

Related journals

There are currently two dedicated HRI journals:

  • International Journal of Social Robotics
  • The open access Journal of Human-Robot Interaction

and there are several more general journals in which one will find HRI articles.


References

  1. ^ Dautenhahn, Kerstin (29 April 2007). "Socially intelligent robots: dimensions of human–robot interaction". Philosophical Transactions of the Royal Society B: Biological Sciences. 362 (1480): 679–704. doi:10.1098/rstb.2006.2004. PMC 2346526. PMID 17301026.
  2. ^ Sauppé, Allison; Mutlu, Bilge (2015). "The Social Impact of a Robot Co-Worker in Industrial Settings". Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems - CHI '15. pp. 3613–3622. doi:10.1145/2702123.2702181. ISBN 978-1-4503-3145-6.
  3. ^ Human-Robot Interaction.
  4. ^ Bubaš, Goran; Lovrenčić, Alen (2002). Implications of interpersonal communication competence research on the design of artificial behavioral systems that interact with humans. Proceedings of the 6th International Conference on Intelligent Engineering Systems - INES 2002.
  5. ^ Collobert, Ronan; Weston, Jason; Bottou, Léon; Karlen, Michael; Kavukcuoglu, Koray; Kuksa, Pavel (2011). Natural Language Processing (Almost) from Scratch. OCLC 963993063.
  6. ^ Mathur, Maya B.; Reichling, David B. (2016). "Navigating a social world with robot partners: a quantitative cartography of the Uncanny Valley". Cognition. 146: 22–32. doi:10.1016/j.cognition.2015.09.008. PMID 26402646.
  7. ^ Adalgeirsson, Sigurdur; Breazeal, Cynthia (2010). MeBot: A Robotic Platform for Socially Embodied Presence (pdf). Hri '10. pp. 15–22. ISBN 9781424448937.
  8. ^ Sakamoto, Daisuke; Kanda, Takayuki; Ono, Tetsuo; Ishiguro, Hiroshi; Hagita, Norihiro (2007). "Android as a telecommunication medium with a human-like presence". Proceeding of the ACM/IEEE international conference on Human-robot interaction - HRI '07. p. 193. doi:10.1145/1228716.1228743. ISBN 978-1-59593-617-2.
  9. ^ Spence, Patric R.; Westerman, David; Edwards, Chad; Edwards, Autumn (July 2014). "Welcoming Our Robot Overlords: Initial Expectations About Interaction With a Robot". Communication Research Reports. 31 (3): 272–280. doi:10.1080/08824096.2014.924337.
  10. ^ Edwards, Chad; Edwards, Autumn; Spence, Patric R.; Westerman, David (21 December 2015). "Initial Interaction Expectations with Robots: Testing the Human-To-Human Interaction Script". Communication Studies. 67 (2): 227–238. doi:10.1080/10510974.2015.1121899.
  11. ^ Anticipatory Robot Control for Efficient Human-Robot Collaboration (pdf). Hri '16. 2016. pp. 83–90. ISBN 9781467383707.
  12. ^ Coordination mechanisms in human-robot collaboration. Proceedings of the ACM/IEEE International Conference on Human-robot Interaction. 2013. CiteSeerX
  13. ^ Sebanz, Natalie; Bekkering, Harold; Knoblich, Günther (February 2006). "Joint action: bodies and minds moving together". Trends in Cognitive Sciences. 10 (2): 70–76. doi:10.1016/j.tics.2005.12.009. PMID 16406326.
  14. ^ Huang, Chien-Ming; Cakmak, Maya; Mutlu, Bilge (2015). Adaptive Coordination Strategies for Human-Robot Handovers (PDF). Robotics: Science and Systems.
  15. ^ "WeBuild: Automatically Distributing Assembly Tasks Among Collocated Workers to Improve Coordination" (PDF). 2017.
  16. ^ Bemelmans, Roger; Gelderblom, Gert Jan; Jonker, Pieter; de Witte, Luc (February 2012). "Socially Assistive Robots in Elderly Care: A Systematic Review into Effects and Effectiveness". Journal of the American Medical Directors Association. 13 (2): 114–120.e1. doi:10.1016/j.jamda.2010.10.002. PMID 21450215.
  17. ^ Yeo, Song Huat; Chen, I.-Ming; Tzuo, Pei-Wen; Causo, Albert; Toh, Lai Poh Emily (2016). "A Review on the Use of Robots in Education and Young Children". Journal of Educational Technology & Society. 19 (2): 148–63. hdl:10220/42422.
  18. ^ Miwa, H.; Itoh, K.; Matsumoto, M.; Zecca, M.; Takariobu, H.; Roccella, S.; Carrozza, M.C.; Dario, P.; Takanishi, A. (2004). "Effective emotional expressions with emotion expression humanoid robot WE-4RII". 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) (IEEE Cat. No.04CH37566). 3. pp. 2203–2208. doi:10.1109/IROS.2004.1389736. ISBN 978-0-7803-8463-7.
  19. ^ Hu, Yuhan; Zhao, Zhengnan; Vimal, Abheek; Hoffman, Guy (2018). "Soft skin texture modulation for social robotics". 2018 IEEE International Conference on Soft Robotics (RoboSoft). pp. 182–187. doi:10.1109/ROBOSOFT.2018.8404917. ISBN 978-1-5386-4516-1.
  20. ^ Kozima, Hideki; Michalowski, Marek P.; Nakagawa, Cocoro (19 November 2008). "Keepon". International Journal of Social Robotics. 1 (1): 3–18. doi:10.1007/s12369-008-0009-8.
  21. ^ Casper, J.; Murphy, R.R. (June 2003). "Human-robot interactions during the robot-assisted urban search and rescue response at the World Trade Center". IEEE Transactions on Systems, Man, and Cybernetics - Part B: Cybernetics. 33 (3): 367–385. doi:10.1109/TSMCB.2003.811794. PMID 18238185.
  22. ^ Royakkers, Lambèr; van Est, Rinie (9 April 2015). "A Literature Review on New Robotics: Automation from Love to War". International Journal of Social Robotics. 7 (5): 549–570. doi:10.1007/s12369-015-0295-x.
  23. ^ Bartneck, Christoph; Michio Okada (2001). "Robotic User Interfaces" (PDF). Proceedings of the Human and Computer Conference. pp. 130–140.
  24. ^ Bartneck, Christoph (February 2011). "The end of the beginning: a reflection on the first five years of the HRI conference". Scientometrics. 86 (2): 487–504. doi:10.1007/s11192-010-0281-x. PMC 3016230. PMID 21297856.

