Virtual Human Interaction Lab

From Wikipedia, the free encyclopedia
Stanford Virtual Human Interaction Lab
  • Established: 2003
  • Field of research: Virtual reality
  • Director: Jeremy Bailenson
  • Address: McClatchy Hall, Building 120, 450 Serra Mall, Stanford, CA 94309, USA
  • Nickname: VHIL
  • Operating agency: Stanford University

The Virtual Human Interaction Lab (VHIL) at Stanford University was founded in 2003 by Jeremy Bailenson, associate professor of communication. The lab conducts research for the Communication Department. VHIL's mission statement includes: "The mission of the Virtual Human Interaction Lab is to understand the dynamics and implications of interactions among people in immersive virtual reality simulations (VR), and other forms of human digital representations in media, communication systems, and games. Researchers in the lab are most concerned with understanding the social interaction that occurs within the confines of VR, and the majority of our work is centered on using empirical, behavioral science methodologies to explore people as they interact in these digital worlds. However, oftentimes it is necessary to develop new gesture tracking systems, three-dimensional modeling techniques, or agent-behavior algorithms in order to answer these basic social questions. Consequently, we also engage in research geared towards developing new ways to produce these VR simulations."

Faculty and research staff

  • Jeremy Bailenson, professor of communication, VHIL Founder
  • Shawnee Baughman, Lab Manager, B.S. and M.S. in communication at Stanford University

Current research

Digital anonymity

Digital media, and avatars more specifically, have made it increasingly easy for users to interact anonymously. In digital worlds our avatars may differ from our physical world selves on a variety of characteristics ranging from name and physical appearance to demographics and attitudes. We are studying how digital media users who anonymize themselves via their avatars may be perceived differently from media users who use avatars that resemble their physical world selves. We are asking questions such as, is ostracism more aversive when it comes from an anonymous or identified digital media user? And, are media users who choose to be anonymous treated differently from media users who are merely assigned anonymous avatars?

Mediators and mimicry

A mediator's success hinges on two important factors: impartiality and rapport. Ironically, the process of establishing rapport can undermine the mediator's ability to convey a sense of impartiality. Mediators thus face a dilemma, one that we believe digital media might be able to help solve. We are now exploring how the affordances of online dispute resolution (ODR) may help mediators strike a delicate balance between developing rapport and maintaining impartiality. One area of particular interest to us is digital mimicry. Mimicry is known to elicit a wide variety of favorable responses; using tracking technology and computer algorithms, we can make virtual mediators subtly yet perfectly mimic disputants' head movements.
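The core of such delayed digital mimicry can be sketched as a simple buffer that replays a tracked head pose after a fixed lag. This is a hypothetical illustration only; the delay length, frame rate, and (pitch, yaw, roll) representation are assumptions, not the lab's actual system.

```python
from collections import deque


class DelayedMimic:
    """Replays a tracked disputant's head orientation after a fixed delay,
    so a virtual mediator can mimic movements without obvious copying."""

    def __init__(self, delay_seconds=4.0, fps=60):
        self.buffer = deque()
        # Number of frames to hold a pose before the mediator repeats it.
        self.delay_frames = int(delay_seconds * fps)

    def update(self, head_pose):
        """Feed the disputant's current (pitch, yaw, roll) each frame.
        Returns the pose the virtual mediator should adopt this frame,
        or None while the delay buffer is still filling."""
        self.buffer.append(head_pose)
        if len(self.buffer) > self.delay_frames:
            return self.buffer.popleft()
        return None
```

A longer delay makes the mimicry harder for disputants to detect consciously while preserving its rapport-building effect.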

Out-of-body experience

What if the virtual self could "feel" in a virtual world the same way the physical self can feel in the physical world? Navigating virtual 3D environments, performing remote surgery, and tanning on a virtual island would become second nature at this level of full immersion. We are studying ways to create and measure this phenomenon, known as self-presence, or an out-of-body experience. Current questions we are asking in this research area include what stimuli are necessary to induce digital body ownership and what modifications of avatars and virtual environments increase self-presence.

Augmented perspective taking

Perspective taking is the ability to mentally put oneself in the shoes of another to imagine what the other person might be thinking and feeling in a certain situation. Immersive virtual environments allow people to vividly share the perceptual experiences of others as if they are in the heat of the moment. In essence, our abilities to take the perspective of another person can be augmented by viscerally sharing their experiences - seeing, hearing, and feeling what the other person did in a particular situation. We can now literally climb into the skin of the other person to fully embody their body and senses. Current projects explore how novel affordances of interactive digital media such as immersion and interactivity can enhance the ability to understand other minds and how the virtual experience can influence our attitudes and behaviors.

Self-endorsing

Self-endorsing is a novel persuasion strategy made possible by the advancement of interactive digital media. The self is no longer just a passive receiver of information, but can simultaneously partake in the formation and dispersion of persuasive messages, persuading the self with the self. What may have sounded like a topic of a futuristic science fiction movie can now be easily and rapidly done using simple graphics software. Tapping into the framework of self-referencing, research on self-endorsing explores how using the self as the source of persuasive messages can powerfully influence attitudes and behaviors in various persuasive contexts.

Automatic facial feature detection and analyses

While most prior research on facial expressions involves some form of manual coding by human coders based on established facial coding systems (e.g., FACS), this methodology uses just a small webcam and computer software to predict an individual's errors and performance quality based only on facial features that are tracked and logged automatically. Using just the first five to seven minutes of facial feature data, researchers were able to predict a participant's performance on a 30-minute experimental task with up to 90% accuracy. There are countless applications for this methodology that would facilitate research on other media effects. For instance, this methodology can predict purchasing decisions based on facial expressions (e.g., "buying" face vs. "not-buying" face) while participants engage in an online shopping task. Researchers can also monitor emotional fluctuations in real time as people select media content and verify whether the choices contribute toward maintaining a good mood (i.e., mood management theory; Zillmann) based on their facial expressions. In addition, advertisers could benefit by receiving real-time data on participants' responses to advertisements. Automatic facial feature analysis is not yet a perfect 'looking glass' into a person's mind, but its advantages are obvious and promising.

Past research

Proteus effect

Researchers discovered that varying the attractiveness or height of a subject's avatar affected how the subject acted in a virtual environment: subjects adapted to the role they felt their avatar played.

Transformed social interaction[edit]

Research on transformed social interaction explores what occurs when behaviors that take place in collaborative virtual environments are augmented or decremented. The lab hopes to see how permitting commonly impossible behaviors in virtual environments alters, and ultimately enhances, the way people perform in learning and business meetings.

Facial identity capture and presidential candidate preference

Through this line of research, it was found that when a subject's face was morphed in a 40:60 ratio with that of John Kerry or George W. Bush, the subject was more likely to prefer the candidate who shared their features. This study has implications for the use of a voter's image and face morphing during national elections to sway a voter's decision.
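At its simplest, the blending step of such a morph is a pixel-wise weighted average of two pre-aligned face images. The sketch below is a toy illustration of that 40:60 weighting, assuming images are represented as flat lists of grayscale values; real morphing software also warps facial landmarks, which this blend omits.

```python
def morph(subject_pixels, candidate_pixels, subject_weight=0.4):
    """Blend two pre-aligned face images pixel by pixel.

    A subject_weight of 0.4 mirrors the 40:60 subject-to-candidate
    ratio used in the study; each output pixel is the weighted
    average of the corresponding subject and candidate pixels.
    """
    w = subject_weight
    return [w * s + (1 - w) * c
            for s, c in zip(subject_pixels, candidate_pixels)]
```

At this ratio the morph reportedly stays below conscious detection while still carrying enough of the subject's features to influence preference.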

Virtual aging's effect on financial decisions

Researchers found that when subjects were presented with digital, older versions of themselves they subsequently adapted their spending behavior to save more for the future.

Eyewitness testimony and virtual police lineups

In collaboration with the Research Center for Virtual Environments and Behavior, the National Science Foundation, and the Federal Judicial Center, VHIL examined witnesses' ability to identify suspects in police lineups conducted in virtual environments. VR gives witnesses the opportunity to examine the lineup in a 3D environment at different distances, and even to examine the suspect at a recreated scene of the crime.

Diversity simulation

Using virtual reality allows people to truly experience the proverbial "walk a mile" in someone else's shoes. By allowing participants to experience another race or gender, researchers at VHIL hoped to raise awareness about ongoing issues with diversity.

References


  1. ^ Fox, J. & Bailenson, J.N. (2010). The use of doppelgängers to promote health behavior change. CyberTherapy & Rehabilitation, 3 (2), 16-17.
  2. ^ Leonetti, C., & Bailenson, J.N. (2010). High-tech view: The use of immersive virtual environments in jury trials. Marquette Law Review, 93(3), 1073.
  3. ^ Bailenson, J.N. & Segovia, K.Y. (2010). Virtual doppelgangers: Psychological effects of avatars who ignore their owners. In W. S. Bainbridge (Ed.), Online worlds: Convergence of the real and the virtual (175-186). New York: Springer.
  4. ^ Segovia, K.Y. & Bailenson, J.N. (2009). Virtually true: Children's acquisition of false memories in virtual reality. Media Psychology, 12, 371-393.
  5. ^ Fox, J., Bailenson, J.N., & Binney, J. (2009). Virtual experiences, physical behaviors: The effect of presence on imitation of an eating avatar. PRESENCE: Teleoperators & Virtual Environments, 18(4), 294-303.
  6. ^ Yee, N., Bailenson, J.N., & Ducheneaut, N. (2009). The Proteus Effect: Implications of transformed digital self-representation on online and offline behavior. Communication Research, 36 (2), 285-312.
  7. ^ Fox, J., & Bailenson, J.N. (2009). Virtual virgins and vamps: The effects of exposure to female characters' sexualized appearance and gaze in an immersive virtual environment. Sex Roles, 61 (3-4), 147-157.
  8. ^ Groom, V., Bailenson, J.N., & Nass, C. (2009). The influence of racial embodiment on racial bias in immersive virtual environments. Social Influence, 4(1), 1-18.
  9. ^ Bailenson, J.N., Iyengar, S., Yee, N., & Collins, N. (2008). Facial similarity between voters and candidates causes influence. Public Opinion Quarterly, 72 (5), 935-961.
  10. ^ Ersner-Hershfield, H., Bailenson, J., & Carstensen, L.L. (2008). Feeling more connected to your future self: Using immersive virtual reality to increase retirement saving. Poster presented at the Association for Psychological Science Annual Convention, Chicago, IL.
  11. ^ Yee, N., Bailenson, J.N. (2008). A method for longitudinal behavioral data collection in Second Life, PRESENCE: Teleoperators and Virtual Environments. 17(6), 594-596.
  12. ^ Bailenson, J.N., Davies, A., Beall. A.C., Blascovich, J., Guadagno, R.E., & McCall, C. (2008). The effects of witness viewpoint distance, angle, and choice on eyewitness accuracy in police lineups conducted in immersive virtual environments. PRESENCE: Teleoperators and Virtual Environments, 17(3), 242-255.
  13. ^ Yee, N., Bailenson, J.N., Urbanek, M., Chang, F., & Merget, D. (2007). The unbearable likeness of being digital: The persistence of nonverbal social norms in online virtual environments. Cyberpsychology and Behavior, 10, 115-121.
  14. ^ Bailenson, J.N. (2006). Transformed social interaction in collaborative virtual environments. In Messaris, P. and Humphreys, L. (Ed.) Digital Media: Transformations in Human Communication. 255-264. New York: Peter Lang.
  15. ^ Yee, N., & Bailenson, J.N. (2006). Walk a mile in digital shoes: The impact of embodied perspective-taking on the reduction of negative stereotyping in immersive virtual environments. Proceedings of PRESENCE 2006: The 9th Annual International Workshop on Presence. August 24 – 26, Cleveland, Ohio, USA.
