Planning fallacy

[Image caption: Daniel Kahneman, who, along with Amos Tversky, proposed the fallacy.]

The planning fallacy is a tendency for people and organizations to underestimate how long they will need to complete a task, even when they have experience of similar tasks overrunning. The term was first proposed in a 1979 paper by Daniel Kahneman and Amos Tversky.[1][2] Since then, the effect has been found for predictions of a wide variety of tasks, including tax form completion, school work, furniture assembly, computer programming and origami.[1][3] In 2003, Lovallo and Kahneman proposed an expanded definition: the tendency to underestimate the time, costs, and risks of future actions while at the same time overestimating the benefits of those same actions. Under this definition, the planning fallacy results not only in time overruns, but also in cost overruns and benefit shortfalls.[4] The bias affects only predictions about one's own tasks; when uninvolved observers predict task completion times, they show a pessimistic bias, overestimating the time needed.[3][5]

Demonstration

In a 1994 study, 37 psychology students were asked to estimate how long it would take to finish their senior theses. The average estimate was 33.9 days. They also estimated how long it would take "if everything went as well as it possibly could" (averaging 27.4 days) and "if everything went as poorly as it possibly could" (averaging 48.6 days). The average actual completion time was 55.5 days, longer even than the average worst-case estimate, with only about 30% of the students completing their theses in the amount of time they had predicted.[6]

Another study asked students to estimate when they would complete their personal academic projects. Specifically, the researchers asked for estimated times by which the students thought it was 50%, 75%, and 99% probable their personal projects would be done.[5]

  • 13% of subjects finished their project by the time to which they had assigned a 50% probability;
  • 19% finished by the time to which they had assigned a 75% probability;
  • 45% finished by the time to which they had assigned a 99% probability (the gap between these assigned and realized rates is sketched below).
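
The gap between assigned confidence and realized completion can be made explicit with a little arithmetic. The following minimal Python sketch simply re-uses the figures reported above to compute the shortfall at each confidence level; it is an illustration of the calibration gap, not part of the original study's analysis.

    # Assigned confidence levels paired with observed completion rates
    # (values taken from the study described above).
    assigned_vs_observed = {0.50: 0.13, 0.75: 0.19, 0.99: 0.45}

    for confidence, observed in assigned_vs_observed.items():
        shortfall_pts = (confidence - observed) * 100
        print(f"{confidence:.0%} confidence level: {observed:.0%} finished in time "
              f"(a shortfall of {shortfall_pts:.0f} percentage points)")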

A survey of Canadian taxpayers, published in 1997, found that they mailed in their tax forms about a week later than they had predicted. They had no misconceptions about their past record of getting forms mailed in, but expected to get it done more quickly the next time.[7] This illustrates a defining feature of the planning fallacy: people recognize that their past predictions have been over-optimistic, while insisting that their current predictions are realistic.[3]

Explanations

Kahneman and Tversky originally explained the fallacy by suggesting that planners focus on the most optimistic scenario for the task, rather than drawing on their full experience of how much time similar tasks require.[3]

Roger Buehler and colleagues account for the fallacy partly in terms of wishful thinking: people expect tasks to be finished quickly and easily because that is the outcome they want.[1]

In a different paper, Buehler and colleagues suggest an explanation in terms of the self-serving bias in how people interpret their past performance: by taking credit for tasks that went well but blaming delays on outside influences, people can discount past evidence of how long a task should take.[1] One experiment found that when people made their predictions anonymously, they did not show the optimistic bias, suggesting that people make optimistic estimates in order to create a favorable impression on others,[8] which is consistent with impression management theory.

One explanation, focalism, may account for the mental discounting of off-project risks. People formulating the plan may eliminate factors they perceive to lie outside the specifics of the project. Additionally, they may discount multiple improbable high-impact risks because each one is so unlikely to happen.[citation needed]

Planners tend to focus on the project itself and to underestimate time lost to sickness, vacation, meetings, and other "overhead" tasks.[citation needed] Planners also tend not to break projects down to a level of detail that allows individual tasks to be estimated, such as placing one brick in one wall; this enhances optimism bias and prevents the use of actual metrics, such as timing the placing of an average brick and multiplying by the number of bricks.[citation needed]
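
A bottom-up estimate of the kind just described can be contrasted with an intuitive single-number guess. The Python sketch below is a hypothetical illustration only: the task breakdown, unit time, and overhead fraction are invented for the example and not taken from any study. It multiplies a measured unit time by the number of units and then inflates the result to allow for overhead such as meetings, sickness, and vacation.

    # Hypothetical bottom-up estimate: time one representative unit of work,
    # multiply by the number of units, then allow for the "overhead" that
    # planners tend to leave out (meetings, sickness, vacation, coordination).
    seconds_per_brick = 45        # measured by timing an average brick (invented figure)
    bricks_in_wall = 12_000       # invented figure
    overhead_fraction = 0.35      # allowance for non-project time (invented figure)

    working_seconds = seconds_per_brick * bricks_in_wall
    total_seconds = working_seconds * (1 + overhead_fraction)

    print(f"Pure bricklaying time: {working_seconds / 3600:.0f} hours")
    print(f"With overhead allowance: {total_seconds / 3600:.0f} hours")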

Complex projects that lack immutable goals may become subject to mission creep, scope creep, and featuritis.[citation needed]

As described by Fred Brooks in The Mythical Man-Month, adding new personnel to an already-late project incurs a variety of risks and overhead costs that tend to make it even later; this is known as Brooks's law.
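
Brooks's observation rests partly on the combinatorics of communication: with n people on a team there are n(n-1)/2 pairwise channels, so coordination overhead grows much faster than headcount. The toy Python sketch below only illustrates that arithmetic; the team sizes are arbitrary and the model says nothing about any particular project.

    # Toy illustration of the arithmetic behind Brooks's law: pairwise
    # communication channels grow quadratically with team size, so adding
    # people adds coordination cost faster than it adds capacity.
    def communication_channels(team_size: int) -> int:
        """Number of pairwise communication paths among team members."""
        return team_size * (team_size - 1) // 2

    for size in (3, 5, 10, 20):
        print(f"{size:>2} people -> {communication_channels(size):>3} channels")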

The "authorization imperative" offers another possible explanation: much of project planning takes place in a context which requires financial approval to proceed with the project, and the planner often has a stake in getting the project approved. This dynamic may lead to a tendency on the part of the planner to deliberately underestimate the project effort required. It is easier to get forgiveness (for overruns) than permission (to commence the project if a realistic effort estimate were provided.) Such deliberate underestimation has been named[by whom?] "strategic misrepresentation".[9]

Apart from psychological explanations, the planning fallacy has also been explained[by whom?] as resulting from a natural asymmetry and from scaling effects. The asymmetry arises because random disruptions tend to add delay and cost rather than savings, so errors do not balance out between positive and negative outcomes. The scaling effect reflects the observation that the consequences of disruptions are not linear: as the size of an effort increases, the error grows disproportionately, because larger efforts are less able to react to problems, particularly when they cannot be delivered in increments. This view contrasts modern projects with earlier efforts that were more commonly completed on time (e.g. the Empire State Building, the Crystal Palace, the Golden Gate Bridge), concluding that modern planning systems have inherent flaws and that modern efforts carry hidden fragility, for example because they are computerized and less localized, offering less insight and control and depending more heavily on transportation.[10]
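
The asymmetry argument can be illustrated with a small simulation: if each subtask occasionally suffers a large delay but can only finish slightly early, the errors do not cancel out and the total overshoots the naive estimate. The Python sketch below is purely illustrative; the distributions and parameters are invented, not drawn from the cited source.

    import random

    # Purely illustrative simulation of asymmetric task disruptions:
    # each subtask can slip badly but can only finish slightly early,
    # so errors accumulate instead of cancelling out.
    random.seed(0)

    SUBTASKS = 50
    PLANNED_DAYS_PER_TASK = 2.0            # naive per-task estimate (invented)
    NAIVE_TOTAL = SUBTASKS * PLANNED_DAYS_PER_TASK

    def simulate_project() -> float:
        total = 0.0
        for _ in range(SUBTASKS):
            if random.random() < 0.15:     # occasional large disruption
                total += PLANNED_DAYS_PER_TASK + random.uniform(2.0, 10.0)
            else:                          # small, bounded upside at best
                total += PLANNED_DAYS_PER_TASK - random.uniform(0.0, 0.3)
        return total

    runs = [simulate_project() for _ in range(10_000)]
    average = sum(runs) / len(runs)
    print(f"Naive plan: {NAIVE_TOTAL:.0f} days; simulated average: {average:.0f} days")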

Notes

  1. Pezzo, Mark V.; Litman, Jordan A.; Pezzo, Stephanie P. (2006). "On the distinction between yuppies and hippies: Individual differences in prediction biases for planning future tasks". Personality and Individual Differences 41 (7): 1359–1371. doi:10.1016/j.paid.2006.03.029. ISSN 0191-8869.
  2. Kahneman, Daniel; Tversky, Amos (1979). "Intuitive prediction: biases and corrective procedures". TIMS Studies in Management Science 12: 313–327.
  3. Buehler, Roger; Griffin, Dale; Ross, Michael (2002). "Inside the planning fallacy: The causes and consequences of optimistic time predictions". In Thomas Gilovich, Dale Griffin, & Daniel Kahneman (eds.), Heuristics and Biases: The Psychology of Intuitive Judgment, pp. 250–270. Cambridge, UK: Cambridge University Press.
  4. Lovallo, Dan; Kahneman, Daniel (July 2003). "Delusions of Success: How Optimism Undermines Executives' Decisions". Harvard Business Review: 56–63.
  5. Buehler, Roger; Griffin, Dale; Ross, Michael (1995). "It's about time: Optimistic predictions in work and love". European Review of Social Psychology 6: 1–32. doi:10.1080/14792779343000112.
  6. Buehler, Roger; Griffin, Dale; Ross, Michael (1994). "Exploring the "planning fallacy": Why people underestimate their task completion times". Journal of Personality and Social Psychology 67 (3): 366–381. doi:10.1037/0022-3514.67.3.366.
  7. Buehler, Roger; Griffin, Dale; Peetz, Johanna (2010). "The Planning Fallacy: Cognitive, Motivational, and Social Origins". Advances in Experimental Social Psychology 43: 9. Retrieved 2012-09-15.
  8. Pezzo, Stephanie P.; Pezzo, Mark V.; Stone, Eric R. (2006). "The social implications of planning: How public predictions bias future plans". Journal of Experimental Social Psychology: 221–227.
  9. Jones, Larry R.; Euske, Kenneth J. (October 1991). "Strategic misrepresentation in budgeting". Journal of Public Administration Research and Theory 1 (4): 437–460. Retrieved 11 March 2013.
  10. Taleb, Nassim (2012-11-27). Antifragile: Things That Gain from Disorder. ISBN 9781400067824.
