The propensity theory of probability is one interpretation of the concept of probability. Theorists who adopt this interpretation think of probability as a physical propensity, or disposition, or tendency of a given type of physical situation to yield an outcome of a certain kind, or to yield a long run relative frequency of such an outcome.
Propensities are not relative frequencies, but purported causes of the observed stable relative frequencies. Propensities are invoked to explain why repeating a certain kind of experiment will generate a given outcome type at a persistent rate. A central aspect of this explanation is the law of large numbers. This law, which is a consequence of the axioms of probability, says that if (for example) a coin is tossed repeatedly many times, in such a way that its probability of landing heads is the same on each toss, and the outcomes are probabilistically independent, then the relative frequency of heads will (with high probability) be close to the probability of heads on each single toss. This law suggests that stable long-run frequencies are a manifestation of invariant single-case probabilities. Frequentists are unable to take this approach, since relative frequencies do not exist for single tosses of a coin, but only for large ensembles or collectives. Hence, these single-case probabilities are known as propensities or chances.
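The law of large numbers can be illustrated with a short simulation. The sketch below (hypothetical code, not from any source cited here) models a biased coin with a fixed single-toss probability of heads and shows that the relative frequency of heads stabilises near that probability as the number of independent tosses grows:

```python
import random

def relative_frequency_of_heads(p, n_tosses, seed=0):
    """Simulate n_tosses independent tosses of a coin whose
    single-toss probability (propensity) of heads is p, and
    return the observed relative frequency of heads."""
    rng = random.Random(seed)  # seeded for reproducibility
    heads = sum(1 for _ in range(n_tosses) if rng.random() < p)
    return heads / n_tosses

# As the number of tosses grows, the relative frequency
# tends to settle near the single-case probability 0.3.
for n in (10, 1_000, 100_000):
    print(n, relative_frequency_of_heads(0.3, n))
```

On the propensity view, the parameter `p` represents the single-case probability on each toss, and the stable frequency that emerges in the long run is its manifestation, as the law of large numbers predicts.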
In addition to explaining the emergence of stable relative frequencies, the idea of propensity is motivated by the desire to make sense of single-case probability attributions in quantum mechanics, such as the probability of decay of a particular atom at a particular time.
The main challenge facing propensity theories is to say exactly what propensity means, and then to show that propensity, so defined, has the required properties. At present, unfortunately, none of the well-recognised accounts of propensity comes close to meeting this challenge.
A later propensity theory was proposed by philosopher Karl Popper, who had only slight acquaintance with the writings of Charles S. Peirce, however. Popper noted that the outcome of a physical experiment is produced by a certain set of "generating conditions". When we repeat an experiment, as the saying goes, we really perform another experiment with a (more or less) similar set of generating conditions. To say that a set of generating conditions has propensity p of producing the outcome E means that those exact conditions, if repeated indefinitely, would produce an outcome sequence in which E occurred with limiting relative frequency p. For Popper, then, a deterministic experiment would have propensity 0 or 1 for each outcome, since those generating conditions would have the same outcome on each trial. In other words, non-trivial propensities (those that differ from 0 and 1) exist only for genuinely indeterministic experiments.
Popper's propensities, while they are not relative frequencies, are nevertheless defined in terms of relative frequency. As a result, they face many of the serious problems that plague frequency theories. First, propensities cannot be empirically ascertained on this account, since the limit of a sequence is a tail event, and is thus independent of its finite initial segments. Seeing a coin land heads every time for the first million tosses, for example, tells one nothing about the limiting proportion of heads on Popper's view. Moreover, the use of relative frequency to define propensity assumes the existence of stable relative frequencies, so one cannot then use propensity to explain the existence of stable relative frequencies via the law of large numbers.
A number of other philosophers, including David Miller and Donald A. Gillies, have proposed propensity theories somewhat similar to Popper's, in that propensities are defined in terms of either long-run or infinitely long-run relative frequencies.
Other propensity theorists (e.g., Ronald Giere) do not explicitly define propensities at all, but rather see propensity as defined by the theoretical role it plays in science. They argue, for example, that physical magnitudes such as electrical charge cannot be explicitly defined in terms of more basic things either, but only in terms of what they do (such as attracting and repelling other electrical charges). In a similar way, propensity is whatever fills the various roles that physical probability plays in science.
Principal Principle of David Lewis
What roles does physical probability play in science? What are its properties? One central property of chance is that, when known, it constrains rational belief to take the same numerical value. David Lewis called this the Principal Principle, a term that philosophers have mostly adopted. For example, suppose you are certain that a particular biased coin has propensity 0.32 to land heads every time it is tossed. What is then the correct price for a gamble that pays $1 if the coin lands heads, and nothing otherwise? According to the Principal Principle, the fair price is 32 cents.
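The arithmetic behind the Principal Principle's fair price is just the expected value of the gamble. The following minimal sketch (illustrative code, with a hypothetical `fair_price` helper) makes the calculation explicit:

```python
def fair_price(chance_of_heads, payoff=1.00):
    """Expected value of a gamble that pays `payoff` dollars if the
    coin lands heads and nothing otherwise, given the known chance.
    Per the Principal Principle, a rational agent who knows the
    chance should set her credence, and hence her price, to it."""
    return chance_of_heads * payoff

# A known propensity of 0.32 makes 32 cents the fair price
# for a gamble paying $1 on heads.
print(fair_price(0.32))  # → 0.32
```

The point of the example is that knowledge of the objective chance fixes the rational betting rate: any price other than 32 cents would reflect a credence diverging from the known chance of 0.32.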
- 'Interpretations of Probability', Stanford Encyclopedia of Philosophy. Retrieved 23 December 2006.
- Miller, Richard W. (1975). "Propensity: Popper or Peirce?". British Journal for the Philosophy of Science 26 (2): 123–132. doi:10.1093/bjps/26.2.123.
- Haack, Susan; Kolenda, Konstantin (1977). "Two Fallibilists in Search of the Truth". Proceedings of the Aristotelian Society 51 (Supplementary Volumes): 63–104. JSTOR 4106816.
- Burks, Arthur W. (1978). Chance, Cause and Reason: An Inquiry into the Nature of Scientific Evidence. University of Chicago Press. 694 pages. ISBN 0-226-08087-0.
- Peirce, Charles Sanders; Burks, Arthur W., ed. (1958). The Collected Papers of Charles Sanders Peirce, Volumes 7 and 8. Cambridge, MA: Harvard University Press; also Belknap Press (of Harvard University Press) edition, vols. 7–8 bound together, 798 pages, online via InteLex; reprinted in 1998 by Thoemmes Continuum.
- Mellor, D. H. (1971). The Matter of Chance. Cambridge: Cambridge University Press.
- Hacking, Ian (1965). Logic of Statistical Inference. Cambridge: Cambridge University Press.
- Popper, Karl; Eccles, John (1977). The Self and Its Brain: An Argument for Interactionism. ISBN 0-415-05898-8.
- Popper, Karl (1957). "The Propensity Interpretation of the Calculus of Probability and of the Quantum Theory". In Korner & Price (eds.), Observation and Interpretation. Butterworth Scientific Publications, pp. 65–70.
- Popper, Karl (1959). The Logic of Scientific Discovery. London: Hutchinson.
- Popper, Karl (1967). "Quantum Mechanics without 'The Observer'". In Bunge, M. (ed.), Quantum Theory and Reality. Berlin, Heidelberg, New York: Springer-Verlag.
- Gillies, Donald (2000). Philosophical Theories of Probability. Routledge.
- Giere, R. N. (1973). "Objective Single-Case Probabilities and the Foundations of Statistics". In Suppes, P., et al. (eds.), Logic, Methodology and Philosophy of Science IV. New York: North-Holland.
- Lewis, David. "A Subjectivist's Guide to Objective Chance". In Jeffrey, Richard C. (ed.), Studies in Inductive Logic and Probability, Vol. II. Berkeley: University of California Press, pp. 263–293. Reprinted with postscripts in Lewis, David (1986), Philosophical Papers, Vol. II. Oxford: Oxford University Press, pp. 83–132.