Computers are social actors

From Wikipedia, the free encyclopedia

Computers Are Social Actors (CASA) is a paradigm which states that humans mindlessly apply the same social heuristics used for human interactions to computers, because computers call to mind social attributes similar to those of humans.[1][2][3]

History and context

Nass and Moon's article, "Machines and Mindlessness: Social Responses to Computers", published in 2000, is the origin of the CASA paradigm. In the article they state that CASA is the concept that people mindlessly apply social rules and expectations to computers even though they know that these machines do not have feelings, intentions, or human motivations.

In their 2000 article, Nass and Moon attribute their study of computers as social actors to two factors: their observation of anthropomorphic reactions to computers and previous research on mindlessness. Specifically, they observed consistent anthropomorphic treatment of computers by individuals in natural and laboratory settings, even though these individuals agreed that computers are not human and should not be treated as such. Additionally, Nass and Moon found a similarity between this behavior and the research on mindlessness by Harvard psychology professor Ellen Langer. Langer states that mindlessness occurs when a specific context triggers an individual to rely on categories, associations, and habits of thought from the past with little to no conscious awareness. When these contexts are triggered, the individual becomes oblivious to novel or alternative aspects of the situation. In this respect, mindlessness is similar to habits and routines, but differs in that, with only one exposure to information, a person forms a cognitive commitment to that information and freezes its potential meaning. With mindlessness, alternative meanings or uses of the information become unavailable for active cognitive use.[4][5]

Social attributes of computers that resemble those of humans include:

  • Words for output
  • Interactivity (the computer 'responds' when you touch a button)
  • Performance of traditional human roles (e.g., Griffin instead of a librarian)

According to CASA, the above attributes trigger scripts for human-human interaction, which lead an individual to ignore cues revealing the asocial nature of a computer. Although individuals using computers exhibit a mindless social response to the computer, individuals who are sensitive to the situation can recognize the inappropriateness of the cued social behaviors.[6]


Cued social behaviors observed in research settings include some of the following:

  • Gender stereotyping: When voice outputs are used on computers, they trigger mindless gender-stereotyped scripts, expectations, and attributions from individuals. For example, a 1997 study revealed that female-voiced tutor computers were rated as more informative about love and relationships than male-voiced tutors, and male-voiced computers were rated as more proficient in technical subjects than female-voiced computers.[7]
  • Reciprocity: When a computer provides help, favors, or benefits, this triggers a mindless response in which the participant feels obliged to 'help' the computer in return. For example, an experiment in 1997 found that when a specific computer 'helped' a person, that person was more likely to do more 'work' for that computer.[8]
  • Specialist versus generalist: When a technology is labeled as 'specialist', this triggers a mindless response by influencing people's perceptions of the content the labeled technology presents. For example, a 2000 study revealed when people watched a television labeled 'News Television' they thought the news segments on that TV were higher in quality, had more information, and were more interesting than people who saw the identical information on a TV labeled 'News and Entertainment Television'.[9]
  • Personality personification: Computer users mindlessly attribute a personality to a computer based on verbal or paraverbal cues in the interface. For example, research from 1996 and 2001 found that people with dominant personalities preferred computers that also had a 'dominant personality'; that is, the computer used strong, assertive language during tasks.[10][11]

Academic research

Three research articles represent some of the advances in the field of CASA. Specifically, researchers in this field are studying how novel variables, manipulations, and new computer software influence mindlessness.

A 2010 article by E.-J. Lee, "What Triggers Social Responses to Flattering Computers?", discussed research on how the human likeness of a computer interface, individuals' rationality, and cognitive busyness moderate the extent to which people apply social attributes to computers. The research revealed that participants were more socially attracted to a computer that flattered them than to one offering generic comments, but they also became more suspicious of the validity of the flattering computer's claims and more likely to dismiss its answers. These negative effects disappeared when participants simultaneously engaged in a secondary task.[12]

A 2011 study by Antos, De Melo, Gratch, and Grosz, "The Influence of Emotion Expression on Perceptions of Trustworthiness in Negotiation", investigated whether computer agents can use the expression of emotion to influence human perceptions of trustworthiness in the context of a negotiation game followed by a trust game. They found that computer agents displaying emotion congruent with their actions were preferred as partners in the trust game over agents whose expressed emotions and actions did not match. They also found that when emotion carried no useful new information, it did not strongly influence human decision-making in the negotiation setting.[13]

A 2011 study by Hong and Sundar, "Social Responses to Computers in Cloud Computing Environment", found that when people are in a cloud computing environment (where programs run over the internet rather than from the computer's hard drive), they shift their source orientation: users evaluate the system by focusing on the service providers over the internet rather than on the machines in front of them. Hong and Sundar conclude their study by stating, "if individuals no longer respond socially to computers in clouds, there will need to be a fundamental re-examination of the mindless social response of humans to computers."[14]

One example of how CASA research can affect consumer behavior and attitudes is Moon's 2000 experiment, which tested the application of the principles of reciprocity and disclosure in a consumer context. She tested these principles through intimate self-disclosure of high-risk information (disclosure that makes the person feel vulnerable) to a computer and observed how that disclosure affected later attitudes and behaviors. Participants interacted with a computer that questioned them using reciprocal wording and a gradual progression toward intimate information; they then completed a puzzle on paper, after which half the group returned to the same computer and the other half moved to a different computer. Both groups were shown 20 products and asked whether they would purchase them. Participants who used the same computer throughout the experiment reported higher purchase likelihood and greater attraction toward the computer presenting the products than participants who switched computers.[15]

References
  1. ^ Nass, C. and Y. Moon, Machines and Mindlessness: Social Responses to Computers. Journal of Social Issues, 2000. 56(1).
  2. ^ Reeves, B. and C.I. Nass, The media equation : how people treat computers, television, and new media like real people and places. 1996, Stanford, Calif.; New York: CSLI Publications ; Cambridge University Press.
  3. ^ Nass, C.I. and S. Brave, Wired for speech : how voice activates and advances the human-computer relationship. 2005, Cambridge, Mass.: MIT Press.
  4. ^ Langer, E., Matters of mind: Mindfulness/mindlessness in perspective. Consciousness and Cognition, 1992. 1(3): p. 289-305.
  5. ^ Langer, E.J., Mindfulness. 1989, Reading, Mass.: Addison-Wesley Pub. Co.
  6. ^ Nass, C. and Y. Moon, Machines and Mindlessness: Social Responses to Computers. Journal of Social Issues, 2000. 56(1).
  7. ^ Nass, C., Y. Moon, and N. Green, Are machines gender neutral? Gender-stereotypic responses to computers with voices. Journal of Applied Social Psychology, 1997. 27: p. 864-76.
  8. ^ Fogg, B.J., & Nass, C. I., How users reciprocate to computers: An experiment that demonstrates behavior change. CHI Extended Abstract. 1997, New York: ACM Press.
  9. ^ Nass, C. and Y. Moon, Machines and Mindlessness: Social Responses to Computers. Journal of Social Issues, 2000. 56(1).
  10. ^ Nass, C. and K.M. Lee, Does computer-synthesized speech manifest personality? Experimental tests of recognition, similarity-attraction, and consistency-attraction. Journal of experimental psychology. Applied, 2001. 7(3): p. 171-81.
  11. ^ Moon, Y. and C. Nass, How "Real" Are Computer Personalities? Psychological Responses to Personality Types in Human-Computer Interaction. Communication Research, 1996. 23(6): p. 651-74.
  12. ^ Lee, E.-J., What Triggers Social Responses to Flattering Computers? Experimental Tests of Anthropomorphism and Mindlessness Explanations. Communication Research, 2010. 37(2): p. 191-214.
  13. ^ Antos, D., De Melo, C., Gratch, J., & Grosz, B. The Influence of Emotion Expression on Perceptions of Trustworthiness in Negotiation. in Twenty-Fifth AAAI Conference on Artificial Intelligence. 2011. San Francisco: Association for the Advancement of Artificial Intelligence.
  14. ^ Hong, S., and Sundar, S. S., Social Responses to Computers in Cloud Computing Environment: The Importance of Source Orientation. ACM, 2011. CHI 2011, May 7–12, 2011, Vancouver, BC, Canada.
  15. ^ Moon, Y., Intimate exchanges: using computers to elicit self-disclosure from consumers. Communication Abstracts, 2000. 23(5).