The measurement problem in quantum mechanics is the problem of how (or whether) wavefunction collapse occurs. The inability to observe this process directly has given rise to different interpretations of quantum mechanics, and poses a key set of questions that each interpretation must answer. The wavefunction in quantum mechanics evolves deterministically according to the Schrödinger equation as a linear superposition of different states, but actual measurements always find the physical system in a definite state. Any future evolution is based on the state the system was discovered to be in when the measurement was made, meaning that the measurement "did something" to the system that is not obviously a consequence of Schrödinger evolution.
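The tension between deterministic evolution and probabilistic outcomes can be sketched in a few lines of code. The following toy model (an illustrative sketch, not from the source) evolves a two-state system deterministically under a unitary rotation, then samples single measurement outcomes according to the Born rule; the state before measurement is fixed, but each measurement yields one definite result at random:

```python
import math
import random

def evolve(state, theta):
    # Deterministic Schrödinger-style evolution: a unitary rotation
    # about the x-axis, U = [[cos(t/2), -i sin(t/2)], [-i sin(t/2), cos(t/2)]].
    a, b = state
    c, s = math.cos(theta / 2), math.sin(theta / 2)
    return (c * a - 1j * s * b, -1j * s * a + c * b)

def born_probs(state):
    # Born rule: outcome probabilities are squared amplitude magnitudes.
    return [abs(amp) ** 2 for amp in state]

def measure(state, rng):
    # A single measurement always yields one definite outcome (0 or 1).
    p0, _ = born_probs(state)
    return 0 if rng.random() < p0 else 1

rng = random.Random(0)
psi = evolve((1 + 0j, 0 + 0j), math.pi / 2)   # |0> evolves into an equal superposition
print(born_probs(psi))                         # probabilities ~[0.5, 0.5]
outcomes = [measure(psi, rng) for _ in range(10000)]
print(sum(outcomes) / len(outcomes))           # long-run frequency ~0.5; each run is definite
```

The state `psi` is completely determined by the dynamics, yet no individual call to `measure` is predictable; only the statistics are.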
To express matters differently (to paraphrase Steven Weinberg), the Schrödinger wave equation determines the wavefunction at any later time from its state at an earlier time. If observers and their measuring apparatus are themselves described by a deterministic wavefunction, why can we not predict precise results for measurements, but only probabilities? As a general question: How can one establish a correspondence between quantum and classical reality?
The best-known example is the "paradox" of Schrödinger's cat. A mechanism is arranged to kill a cat if a quantum event, such as the decay of a radioactive atom, occurs. Thus the fate of a large-scale object, the cat, is entangled with the fate of a quantum object, the atom. Prior to observation, according to the Schrödinger equation, the cat is apparently evolving into a linear combination of states that can be characterized as an "alive cat" and states that can be characterized as a "dead cat". Each of these possibilities is associated with a specific nonzero probability amplitude; the cat seems to be in some kind of "combination" state called a "quantum superposition". However, a single, particular observation of the cat does not measure the probabilities: it always finds either a living cat or a dead cat. After the measurement the cat is definitively alive or dead. The question is: How are the probabilities converted into an actual, sharply well-defined outcome?
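The entanglement in the cat thought experiment can be made concrete with a small sketch (illustrative, not from the source): the joint atom–cat state is the superposition (|undecayed, alive⟩ + |decayed, dead⟩)/√2, and sampling outcomes from it only ever yields the two perfectly correlated branches:

```python
import random

# Basis ordering for the joint (atom, cat) system:
# 0: |undecayed, alive>, 1: |undecayed, dead>,
# 2: |decayed, alive>,   3: |decayed, dead>.
AMP = 2 ** -0.5
cat_state = [AMP, 0.0, 0.0, AMP]   # (|undecayed,alive> + |decayed,dead>)/sqrt(2)

def observe(state, rng):
    # Sample one definite joint outcome with Born-rule probabilities.
    probs = [abs(a) ** 2 for a in state]
    r, acc = rng.random(), 0.0
    for i, p in enumerate(probs):
        acc += p
        if r < acc:
            return i

rng = random.Random(1)
outcomes = [observe(cat_state, rng) for _ in range(1000)]
print(sorted(set(outcomes)))   # only branches 0 and 3 ever occur
```

Every observation finds either "undecayed and alive" or "decayed and dead", each about half the time; the mixed branches (indices 1 and 2) have zero amplitude and never appear.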
Hugh Everett's many-worlds interpretation attempts to solve the problem by suggesting there is only one wavefunction, the superposition of the entire universe, and it never collapses—so there is no measurement problem. Instead, the act of measurement is simply an interaction between quantum entities (observer, measuring instrument, electron/positron, etc.) which entangle to form a single larger entity, for instance living cat/happy scientist. Everett also attempted to demonstrate how the probabilistic nature of quantum mechanics would appear in measurements, work that was later extended by Bryce DeWitt.
De Broglie–Bohm theory tries to solve the measurement problem very differently: the information describing the system contains not only the wavefunction, but also supplementary data (a trajectory) giving the position of the particle(s). The role of the wavefunction is to generate the velocity field for the particles. These velocities are such that the probability distribution for the particle remains consistent with the predictions of orthodox quantum mechanics. According to de Broglie–Bohm theory, interaction with the environment during a measurement procedure separates the wave packets in configuration space, which is where apparent wavefunction collapse comes from even though there is no actual collapse.
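The guidance equation that generates this velocity field is v(x) = (ħ/m) Im(ψ′(x)/ψ(x)). A minimal numerical sketch (an illustration in natural units, not from the source) evaluates it by finite differences and checks it against the plane wave ψ(x) = e^{ikx}, for which the guidance equation gives the familiar v = ħk/m:

```python
import cmath

HBAR, M = 1.0, 1.0   # natural units, chosen for illustration

def bohm_velocity(psi, x, dx=1e-6):
    # Guidance equation: v(x) = (hbar/m) * Im( psi'(x) / psi(x) ),
    # with the spatial derivative taken by central finite difference.
    dpsi = (psi(x + dx) - psi(x - dx)) / (2 * dx)
    return (HBAR / M) * (dpsi / psi(x)).imag

# For a plane wave exp(i k x), the derivative is i k * psi,
# so the velocity field is constant: v = hbar * k / m.
k = 2.0
plane_wave = lambda x: cmath.exp(1j * k * x)
print(bohm_velocity(plane_wave, x=0.3))   # ~2.0
```

For superpositions the velocity field is position-dependent, which is how the particle trajectory ends up in one wave packet rather than another after the packets separate.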
Erich Joos and Heinz-Dieter Zeh claim that the phenomenon of quantum decoherence, which was put on firm ground in the 1980s, resolves the problem. The idea is that the environment causes the classical appearance of macroscopic objects. Zeh further claims that decoherence makes it possible to identify the fuzzy boundary between the quantum microworld and the world where classical intuition is applicable. Quantum decoherence was proposed in the context of the many-worlds interpretation, but it has also become an important part of some modern updates of the Copenhagen interpretation based on consistent histories. Quantum decoherence does not describe the actual process of wavefunction collapse, but it explains the conversion of the quantum probabilities (which exhibit interference effects) into the ordinary classical probabilities. See, for example, Zurek, Zeh and Schlosshauer.
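The mechanism can be illustrated with a toy model (an assumed setup for illustration, not from the source). A system qubit in the superposition a|0⟩ + b|1⟩ couples to n environment qubits, each of which ends up in state |E0⟩ or |E1⟩ depending on the system's branch, with single-qubit overlap ⟨E0|E1⟩ = c. Tracing out the environment multiplies the off-diagonal ("interference") element of the system's density matrix by cⁿ, which decays exponentially with environment size:

```python
# Toy decoherence model.  The system starts in an equal superposition;
# each of n environment qubits records the branch with per-qubit
# branch overlap c (assumed value for illustration).
a = b = 2 ** -0.5   # amplitudes of the equal superposition
c = 0.9             # single environment-qubit overlap <E0|E1>

def offdiag(n):
    # Magnitude of the reduced density matrix element rho_01
    # after tracing out n environment qubits: |a b*| * c**n.
    return a * b * c ** n

for n in (0, 10, 100):
    print(n, offdiag(n))
```

The off-diagonal term vanishes for large n, so interference becomes unobservable, while the diagonal probabilities |a|² and |b|² are untouched. This is exactly the sense in which decoherence converts quantum probabilities into classical ones without selecting an actual outcome.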
The present situation is slowly clarifying, as described in a recent paper by Schlosshauer:
- Several decoherence-unrelated proposals have been put forward in the past to elucidate the meaning of probabilities and arrive at the Born rule ... It is fair to say that no decisive conclusion appears to have been reached as to the success of these derivations. ...
- As is well known, [many papers by Bohr insist upon] the fundamental role of classical concepts. The experimental evidence for superpositions of macroscopically distinct states on increasingly large length scales counters such a dictum. Superpositions appear to be novel and individually existing states, often without any classical counterparts. Only the physical interactions between systems then determine a particular decomposition into classical states from the view of each particular system. Thus classical concepts are to be understood as locally emergent in a relative-state sense and should no longer claim a fundamental role in the physical theory.
A fourth approach is given by objective collapse models. In such models, the Schrödinger equation is modified and obtains nonlinear terms. These nonlinear modifications are of stochastic nature and lead to behaviour which, for microscopic quantum objects such as electrons or atoms, is unmeasurably close to that given by the usual Schrödinger equation. For macroscopic objects, however, the nonlinear modification becomes important and induces the collapse of the wavefunction. Objective collapse models are effective theories. The stochastic modification is thought to stem from some external non-quantum field, but the nature of this field is unknown. One possible candidate is the gravitational interaction, as in the models of Diósi and Penrose. The main difference between objective collapse models and the other approaches is that they make falsifiable predictions that differ from standard quantum mechanics. Experiments are already getting close to the parameter regime where these predictions can be tested.
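A back-of-the-envelope sketch (illustrative, using the per-particle collapse rate of roughly 10⁻¹⁶ s⁻¹ suggested in the original GRW model) shows why such modifications are invisible for single particles yet decisive for macroscopic objects: each constituent suffers spontaneous localizations independently, so a superposition of an N-particle object decays at rate N·λ.

```python
# GRW-style estimate of how long a superposition survives.
LAMBDA = 1e-16   # per-particle spontaneous-localization rate, s^-1 (GRW's suggested value)

def superposition_lifetime(n_particles):
    # Mean time before some constituent localizes, collapsing the whole
    # entangled superposition: 1 / (N * lambda), in seconds.
    return 1.0 / (n_particles * LAMBDA)

print(superposition_lifetime(1))      # single electron: ~1e16 s, far longer than any experiment
print(superposition_lifetime(1e23))   # ~Avogadro-scale object: ~1e-7 s, effectively instantaneous
```

This scaling is what makes the models falsifiable: intermediate-mass superpositions (large molecules, micromechanical resonators) should show deviations from standard quantum mechanics at measurable rates.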
An interesting solution to the measurement problem is also provided by the hidden-measurements interpretation of quantum mechanics. The hypothesis at the basis of this approach is that in a typical quantum measurement there is a condition of lack of knowledge about which interaction between the measured entity and the measuring apparatus is actualized at each run of the experiment. One can then show that the Born rule can be derived by considering a uniform average over all these possible measurement-interactions. 
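A Monte Carlo sketch of Aerts' sphere ("elastic") model illustrates this derivation (an illustrative toy, with parameters chosen here). The spin state is a point on the Bloch sphere at angle θ from the measurement axis, projecting onto the axis at cos θ; the hidden variable is a uniformly random break point of an elastic stretched along the axis from −1 to +1. If the elastic breaks below the projection, the point is pulled to the +1 ("up") outcome, and averaging uniformly over break points reproduces the Born probability cos²(θ/2):

```python
import math
import random

def p_up(theta, trials=200000, seed=2):
    # Hidden variable: the elastic's break point, uniform on [-1, 1].
    # Outcome is "up" when the break lies below the state's projection.
    rng = random.Random(seed)
    proj = math.cos(theta)
    ups = sum(rng.uniform(-1, 1) < proj for _ in range(trials))
    return ups / trials

theta = math.pi / 3
print(p_up(theta))                 # Monte Carlo estimate, ~0.75
print(math.cos(theta / 2) ** 2)    # Born-rule prediction: 0.75
```

The agreement holds for any θ, since (1 + cos θ)/2 = cos²(θ/2); the randomness of the outcome is traced to ignorance of which measurement interaction (break point) is actualized.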
References and notes
- Steven Weinberg (1998). The Oxford History of the Twentieth Century (Michael Howard & William Roger Louis, eds.). Oxford University Press. p. 26. ISBN 0-19-820428-0.
- Steven Weinberg, "Einstein's Mistakes", Physics Today (2005); see subsection "Contra quantum mechanics".
- Wojciech Hubert Zurek, "Decoherence, einselection, and the quantum origins of the classical", Reviews of Modern Physics, Vol. 75, July 2003.
- Joos, E., and H. D. Zeh, "The emergence of classical properties through interaction with the environment" (1985), Z. Phys. B 59, 223.
- H. D. Zeh, Chapter 2 in Decoherence and the Appearance of a Classical World in Quantum Theory (2nd ed.; Erich Joos, H. D. Zeh, C. Kiefer, Domenico Giulini, J. Kupsch, I. O. Stamatescu, eds.). Springer-Verlag (2003). ISBN 3-540-00390-8.
- Jaeger, Gregg (September 2014). "What in the (quantum) world is macroscopic?". American Journal of Physics 82 (9): 896–905. Bibcode:2014AmJPh..82..896J. doi:10.1119/1.4878358.
- V. P. Belavkin (1994). "Nondemolition principle of quantum measurement theory". Foundations of Physics 24 (5): 685–714. arXiv:quant-ph/0512188. Bibcode:1994FoPh...24..685B. doi:10.1007/BF02054669.
- V. P. Belavkin (2001). "Quantum noise, bits and jumps: uncertainties, decoherence, measurements and filtering". Progress in Quantum Electronics 25 (1): 1–53. arXiv:quant-ph/0512208. Bibcode:2001PQE....25....1B. doi:10.1016/S0079-6727(00)00011-2.
- Maximilian Schlosshauer (2005). "Decoherence, the measurement problem, and interpretations of quantum mechanics". Rev. Mod. Phys. 76 (4): 1267–1305. arXiv:quant-ph/0312059. Bibcode:2004RvMP...76.1267S. doi:10.1103/RevModPhys.76.1267.
- Maximilian Schlosshauer (January 2006). "Experimental motivation and empirical consistency in minimal no-collapse quantum mechanics". Annals of Physics 321 (1): 112–149. arXiv:quant-ph/0506199. Bibcode:2006AnPhy.321..112S. doi:10.1016/j.aop.2005.10.004.
- Angelo Bassi; Kinjalk Lochan; Seema Satin; Tejinder P. Singh; Hendrik Ulbricht (2013). "Models of wave-function collapse, underlying theories, and experimental tests". Reviews of Modern Physics 85: 471–527. arXiv:1204.4325. Bibcode:2013RvMP...85..471B. doi:10.1103/RevModPhys.85.471.
- Aerts, D. (1986). A possible explanation for the probabilities of quantum mechanics, Journal of Mathematical Physics, 27, pp. 202-210.
- Aerts, D. and Sassoli de Bianchi, M. (2014). The extended Bloch representation of quantum mechanics and the hidden-measurement solution to the measurement problem. Annals of Physics 351, pp. 975–1025.
- R. Buniy, S. Hsu and A. Zee, "On the origin of probability in quantum mechanics" (2006).
- Walker, Evan (2000). The Physics of Consciousness. Cambridge: Perseus. ISBN 0-7382-0436-6. The book discusses the measurement problem, observer theory, and consciousness.
- The Quantum Measurement Problem: two presentations, one non-technical and one more technical.