The intentional stance is a term coined by philosopher Daniel Dennett for the level of abstraction in which we view the behavior of a thing in terms of mental properties. It is part of a theory of mental content proposed by Dennett, which provides the underpinnings of his later works on free will, consciousness, folk psychology, and evolution.
Here is how it works: first you decide to treat the object whose behavior is to be predicted as a rational agent; then you figure out what beliefs that agent ought to have, given its place in the world and its purpose. Then you figure out what desires it ought to have, on the same considerations, and finally you predict that this rational agent will act to further its goals in the light of its beliefs. A little practical reasoning from the chosen set of beliefs and desires will in most instances yield a decision about what the agent ought to do; that is what you predict the agent will do.

—Daniel Dennett, The Intentional Stance, p. 17
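The recipe in the passage above can be sketched as a toy prediction procedure. This is only an illustration of the shape of the strategy, not anything Dennett provides: the function names and the trivially simple belief/desire model are assumptions made for the sketch, using the Mary-at-the-movies example discussed later in the article.

```python
# Toy sketch of the intentional-stance recipe: ascribe the beliefs and
# desires a presumed-rational agent *ought* to have, then predict the
# action that best furthers its desires in light of its beliefs.
# All names and the belief/desire model here are illustrative.

def ascribe_beliefs(situation):
    """Beliefs the agent ought to have, given its place in the world."""
    return {"movie_over": situation["movie_over"]}

def ascribe_desires(situation):
    """Desires the agent ought to have, on the same considerations."""
    return {"eat": situation["hungry"]}

def predict_action(situation):
    beliefs = ascribe_beliefs(situation)
    desires = ascribe_desires(situation)
    # Practical reasoning: the rational agent acts to further its
    # goals in the light of its beliefs.
    if desires["eat"] and beliefs["movie_over"]:
        return "leave the theater and drive to a restaurant"
    return "stay seated"

print(predict_action({"movie_over": True, "hungry": True}))
```

Note that nothing in the sketch inspects the agent's internals; the prediction is driven entirely by the ascribed beliefs and desires, which is the point of the stance.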
Dennett's three levels
The core idea is that, when explaining and predicting the behavior of an object, we can choose to view it at varying levels of abstraction. The more concrete the level, the more accurate in principle our predictions are; the more abstract, the greater the computational power we gain by zooming out and skipping over the irrelevant details.
Dennett defines three levels of abstraction:
- The most concrete is the physical stance, which is the domain of physics and chemistry. At this level, we are concerned with such things as mass, energy, velocity, and chemical composition. When we predict where a ball is going to land based on its current trajectory, we are taking the physical stance. Another example of this stance comes when we look at a strip made up of two types of metal bonded together and predict how it will bend as the temperature changes, based on the physical properties of the two metals.
- Somewhat more abstract is the design stance, which is the domain of biology and engineering. At this level, we are concerned with such things as purpose, function and design. When we predict that a bird will fly when it flaps its wings on the basis that wings are made for flying, we are taking the design stance. Likewise, we can understand the bimetallic strip as a particular type of thermometer, not concerning ourselves with the details of how this type of thermometer happens to work. We can also recognize the purpose that this thermometer serves inside a thermostat and even generalize to other kinds of thermostats that might use a different sort of thermometer. We can even explain the thermostat in terms of what it's good for, saying that it keeps track of the temperature and turns on the heater whenever it gets below a minimum, turning it off once it reaches a maximum.
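The thermostat description above is itself a small functional specification, and can be sketched directly as code. The sketch deliberately stays at the design stance: it models only the device's function (keep the temperature within a band), not the physics of the bimetallic strip. The class name and thresholds are illustrative assumptions.

```python
# Design-stance model of the thermostat described above: turn the
# heater on whenever the temperature drops below a minimum, and off
# once it reaches a maximum. Between the two, the previous state is
# kept (hysteresis). The underlying sensor (bimetallic strip, mercury
# tube, ...) is irrelevant at this level of abstraction.

class Thermostat:
    def __init__(self, minimum, maximum):
        self.minimum = minimum
        self.maximum = maximum
        self.heater_on = False

    def update(self, temperature):
        if temperature < self.minimum:
            self.heater_on = True
        elif temperature >= self.maximum:
            self.heater_on = False
        return self.heater_on

t = Thermostat(minimum=18, maximum=22)
print(t.update(16))  # below the minimum -> True
print(t.update(20))  # inside the band  -> stays True
print(t.update(23))  # at/above maximum -> False
```

That the same code describes any thermostat, whatever kind of thermometer it uses, is exactly the generalization the design stance buys us.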
- Most abstract is the intentional stance, which is the domain of software and minds. At this level, we are concerned with such things as belief, thinking and intent. When we predict that the bird will fly away because it knows the cat is coming and is afraid of getting eaten, we are taking the intentional stance. Another example would be when we predict that Mary will leave the theater and drive to the restaurant because she sees that the movie is over and is hungry.
A key point is that switching to a higher level of abstraction has its risks as well as its benefits. For example, when we view both a bimetallic strip and a tube of mercury as thermometers, we can lose track of the fact that they differ in accuracy and temperature range, leading to false predictions as soon as the thermometer is used outside the circumstances for which it was designed. The actions of a mercury thermometer heated to 500 °C can no longer be predicted on the basis of treating it as a thermometer; we have to sink down to the physical stance to understand it as a melted and boiled piece of junk. For that matter, the "actions" of a dead bird are not predictable in terms of beliefs or desires.
Even when there is no immediate error, a higher-level stance can simply fail to be useful. If we were to try to understand the thermostat at the level of the intentional stance, ascribing to it beliefs about how hot it is and a desire to keep the temperature just right, we would gain no traction over the problem as compared to staying at the design stance, but we would generate theoretical commitments that expose us to absurdities, such as the possibility of the thermostat not being in the mood to work today because the weather is so nice. Whether to take a particular stance, then, is determined by how successful that stance is when applied.
Dennett argues that it is best to understand human behavior at the level of the intentional stance, without making any specific commitments to any deeper reality of the artifacts of folk psychology. In addition to the controversy inherent in this claim, there is also some dispute about the extent to which Dennett commits to realism about mental properties. Initially, his interpretation was seen as leaning towards instrumentalism, but over the years, as the idea has been used to support more extensive theories of consciousness, it has been read as closer to realism. His own words hint at something in the middle: he suggests that the self is as real as a center of gravity, "an abstract object, a theorist's fiction", but operationally valid.
Objections and replies
The most obvious objection to Dennett is the intuition that it "matters" to us whether an object has an inner life or not. The claim is that we don't just imagine the intentional states of other people in order to predict their behavior; the fact that they have thoughts and feelings just as we do is central to notions such as trust, friendship, and love. The Blockhead argument proposes that someone, Jones, has a twin who is in fact not a person but a very sophisticated robot which looks and acts like Jones in every way, but who (it is claimed) somehow does not have any thoughts or feelings at all, just a chip which controls his behavior; in other words, "the lights are on but no one's home". According to intentional systems theory (IST), Jones and the robot have precisely the same beliefs and desires, but this is claimed to be false: the IST expert assigns the same mental states to Blockhead as to Jones, "whereas in fact [Blockhead] has not a thought in his head." Dennett argues against this by denying the premise, on the basis that the robot is a philosophical zombie and therefore metaphysically impossible. In other words, if something acts in all ways conscious, it necessarily is, since consciousness is defined in terms of behavioral capacity rather than ineffable qualia.
Another objection attacks the premise that treating people as ideally rational creatures will yield the best predictions. Stephen Stich argues that people often have beliefs or desires which are irrational or bizarre, and IST doesn't allow us to say anything about these. If the person's "environmental niche" is examined closely enough, and the possibility of malfunction in their brain (which might affect their reasoning capacities) is looked into, it may be possible to formulate a predictive strategy specific to that person. Indeed, this is what we often do when someone is behaving unpredictably: we look for the reasons why. In other words, we can only deal with irrationality by contrasting it against a background assumption of rationality. Far from undermining the intentional stance, this reply suggests that rationality remains the default against which deviations are made intelligible.
The rationale behind the intentional stance is based on evolutionary theory, particularly the notion that the ability to make quick predictions of a system's behavior based on what we think it might be thinking conferred an evolutionary advantage. The fact that our predictive powers are not perfect is a further result of the advantages that sometimes accrue from acting contrary to expectations.
Philip Robbins and Anthony I. Jack suggest that "Dennett's philosophical distinction between the physical and intentional stances has a lot going for it" from the perspective of psychology and neuroscience. They review studies on abilities to adopt an intentional stance (variously called "mindreading," "mentalizing," or "theory of mind") as distinct from adopting a physical stance ("folk physics," "intuitive physics," or "theory of body"). Autism seems to be a deficit in the intentional stance with preservation of the physical stance, while Williams syndrome can involve deficits in the physical stance with preservation of the intentional stance. This tentatively suggests a double dissociation of intentional and physical stances in the brain.
Robbins and Jack point to a 2003 study in which participants viewed animated geometric shapes in different "vignettes," some of which could be interpreted as constituting social interaction, while others suggested mechanical behavior. Viewing social interactions elicited activity in brain regions associated with identifying faces and biological objects (posterior temporal cortex), as well as emotion processing (right amygdala and ventromedial prefrontal cortex). Meanwhile, the mechanical interactions activated regions related to identifying objects like tools that can be manipulated (posterior temporal lobe). The authors suggest "that these findings reveal putative 'core systems' for social and mechanical understanding that are divisible into constituent parts or elements with distinct processing and storage capabilities."
Robbins and Jack argue for an additional stance beyond the three that Dennett outlined. They call it the phenomenal stance: attributing consciousness, emotions, and inner experience to a mind. The explanatory gap of the hard problem of consciousness illustrates this tendency of people to see phenomenal experience as different from physical processes. The authors suggest that psychopathy may represent a deficit in the phenomenal but not the intentional stance, while people with autism appear to have intact moral sensibilities, just not mind-reading abilities. These examples suggest a double dissociation between the intentional and phenomenal stances.
In a follow-up paper, Robbins and Jack describe four experiments about how the intentional and phenomenal stances relate to feelings of moral concern. The first two experiments showed that talking about lobsters as strongly emotional led to a much greater sentiment that lobsters deserved welfare protections than did talking about lobsters as highly intelligent. The third and fourth studies found that perceiving an agent as vulnerable led to greater attributions of phenomenal experience. Also, people who scored higher on the empathetic-concern subscale of the Interpersonal Reactivity Index had generally higher absolute attributions of mental experience.
Bryce Huebner (2010) performed two experimental philosophy studies to test students' ascriptions of various mental states to humans compared with cyborgs and robots. Experiment 1 showed that while students attributed both beliefs and pains most strongly to humans, they were more willing to attribute beliefs than pains to robots and cyborgs (p. 138). "[T]hese data seem to confirm that commonsense psychology does draw a distinction between phenomenal and non-phenomenal states – and this distinction seems to be dependent on the structural properties of an entity in a way that ascriptions of non-phenomenal states are not." (pp. 138–39) However, this conclusion is only tentative in view of the high variance among participants (p. 139). Experiment 2 showed analogous results: both beliefs and happiness were ascribed most strongly to biological humans, and ascriptions of happiness to robots or cyborgs were less common than ascriptions of beliefs (p. 142).
See also
- Philosophy of mind
- Philosophical realism
- Marr's levels of analysis
- Conceptual blending
References
- Dennett, D. C. (1987). "Three Kinds of Intentional Psychology", pp. 43–68 in The Intentional Stance, Cambridge, MA: MIT Press.
- Daniel Dennett. "The Self as a Center of Narrative Gravity". Retrieved 2008-07-03.
- Daniel Dennett, The Unimagined Preposterousness of Zombies
- Philip Robbins; Anthony I. Jack (Jan 2006). "The phenomenal stance". Philosophical Studies 127 (1): 59–85. doi:10.1007/s11098-005-1730-x.
- Alex Martin; Jill Weisberg (2003). "Neural Foundations for Understanding Social and Mechanical Concepts". Cogn Neuropsychol 20 (3–6): 575–587. doi:10.1080/02643290342000005. PMC 1450338. PMID 16648880.
- Anthony I. Jack; Philip Robbins (Sep 2012). "The Phenomenal Stance Revisited". Review of Philosophy and Psychology 3 (3): 383–403. doi:10.1007/s13164-012-0104-5.
- Bryce Huebner (Mar 2010). "Commonsense concepts of phenomenal consciousness: Does anyone care about functional zombies?". Phenomenology and the Cognitive Sciences 9 (1): 133–155. doi:10.1007/s11097-009-9126-6.
- Daniel C. Dennett (1996), The Intentional Stance (6th printing), Cambridge, Massachusetts: The MIT Press, ISBN 0-262-54053-3 (First published 1987).
- Daniel C. Dennett (1997), "Chapter 3. True Believers: The Intentional Strategy and Why it Works", in John Haugeland, Mind Design II: Philosophy, Psychology, Artificial Intelligence. Massachusetts: Massachusetts Institute of Technology. ISBN 0-262-08259-4 (first published in Scientific Explanation, 1981, edited by A.F. Heath, Oxford: Oxford University Press; originally presented as a Herbert Spencer lecture at Oxford in November 1979; also published as chapter 2 in Dennett's book The Intentional Stance).
- Dennett, D. "Three kinds of intentional psychology" (IP) in Heil, J. - Philosophy of Mind: A guide and anthology, Clarendon Press, Oxford, 2004
- Braddon-Mitchell, D., & Jackson, F. Philosophy of Mind and Cognition, Basil Blackwell, Oxford, 1996
- Dennett, D. "True Believers" in Dennett, D. The Intentional Stance, MIT Press, Cambridge, Mass., 1987
- Fodor, J. Psychosemantics, MIT Press, Cambridge, Mass., 1987.
- Lycan, W. Mind & Cognition, Basil Blackwell, Oxford, 1990
- Fano, Vincenzo. "Holism and the naturalization of consciousness" in Holism, Massimo Dell'Utri. Quodlibet. 2002.