Measure problem (cosmology)

From Wikipedia, the free encyclopedia

The measure problem in cosmology concerns how to compute fractions of universes of different types within a multiverse. It typically arises in the context of eternal inflation. The problem arises because different approaches to calculating these fractions yield different results, and it is not clear which approach (if any) is correct.[1]

Measures can be evaluated by whether they predict observed physical constants, as well as whether they avoid counterintuitive implications, such as the youngness paradox or Boltzmann brains.[2] While dozens of measures have been proposed,[3]: 2  few physicists consider the problem to be solved.[4]

The problem

Infinite multiverse theories are becoming increasingly popular, but because they involve infinitely many instances of different types of universes, it is unclear how to compute the fraction of each type of universe.[4] Alan Guth put it this way:[4]

In a single universe, cows born with two heads are rarer than cows born with one head. [But in an infinitely branching multiverse] there are an infinite number of one-headed cows and an infinite number of two-headed cows. What happens to the ratio?

Sean M. Carroll offered another informal example:[1]

Say there are an infinite number of universes in which George W. Bush became President in 2000, and also an infinite number in which Al Gore became President in 2000. To calculate the fraction N(Bush)/N(Gore), we need to have a measure — a way of taming those infinities. Usually this is done by “regularization.” We start with a small piece of universe where all the numbers are finite, calculate the fraction, and then let our piece get bigger, and calculate the limit that our fraction approaches.

Different procedures for computing the limit of this fraction yield wildly different answers.[1]

One way to illustrate how different regularization methods produce different answers is to compute the limiting fraction of positive integers that are even. Suppose the integers are ordered the usual way,

1, 2, 3, 4, 5, 6, 7, 8, ... (OEIS A000027)

At a cutoff of "the first five elements of the list", the fraction is 2/5; at a cutoff of "the first six elements", it is 1/2; as the cutoff grows, the fraction converges to 1/2. However, if the integers are ordered such that any two consecutive odd numbers are separated by two even numbers,

1, 2, 4, 3, 6, 8, 5, 10, 12, 7, 14, 16, ... (OEIS A265667)

the fraction of integers that are even converges to 2/3 rather than 1/2.[5]
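This dependence on ordering is easy to check numerically. The sketch below (plain Python; the second ordering is built by following each odd number with the next two unused even numbers, as in the sequence above) computes the fraction of even numbers under a growing cutoff:

```python
def natural_order(n):
    """First n positive integers in the usual order: 1, 2, 3, 4, ..."""
    return list(range(1, n + 1))

def odd_then_two_evens(n):
    """First n terms of the reordering 1, 2, 4, 3, 6, 8, 5, 10, 12, ...:
    each odd number is followed by the next two unused even numbers."""
    seq, odd, even = [], 1, 2
    while len(seq) < n:
        seq.append(odd)
        odd += 2
        for _ in range(2):
            seq.append(even)
            even += 2
    return seq[:n]

def even_fraction(seq):
    """Fraction of the entries that are even."""
    return sum(1 for k in seq if k % 2 == 0) / len(seq)

print(even_fraction(natural_order(300_000)))       # 0.5
print(even_fraction(odd_then_two_evens(300_000)))  # ≈ 0.6667
```

Both orderings enumerate exactly the same set of integers; only the regularization (the ordering used to take the limit) differs, yet the limits disagree.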

A popular way to decide what ordering to use in regularization is to pick the simplest or most natural-seeming method of ordering. Everyone agrees that the first sequence, ordered by increasing size of the integers, seems more natural. Similarly, many physicists agree that the "proper-time cutoff measure" (below) seems the simplest and most natural method of regularization. Unfortunately, the proper-time cutoff measure seems to produce incorrect results.[3]: 2 [5]

The measure problem is important in cosmology because in order to compare cosmological theories in an infinite multiverse, we need to know which types of universes they predict to be more common than others.[4]

Proposed measures

In this toy multiverse, the left-hand region exits inflation (red line) later than the right-hand region does. With the proper-time cutoff shown by the black dotted lines, the immediately post-inflation portion of the left-hand universe dominates the measure, flooding the measure with five "Boltzmann babies" (red) that are freakishly young. Extending the proper-time cutoff to later times does not help, as other regions (not pictured) that exit inflation even later would then dominate. With the scale-factor cutoff shown by the gray dotted lines, only observers who exist before the region has expanded by the scale factor are counted, giving normal observers (blue) time to dominate the measure, while the left-hand universe hits the scale cutoff even before it exits inflation in this example.[3]

Proper-time cutoff

The proper-time cutoff measure considers the probability of finding a given scalar field φ at a given proper time t.[3]: 1–2  During inflation, the region around a point grows like e^{3HΔt} in a small proper-time interval Δt,[3]: 1  where H is the Hubble parameter.

This measure has the advantage of being stationary, in the sense that probabilities remain the same over time in the limit of large t.[3]: 1  However, it suffers from the youngness paradox: it makes it exponentially more probable that we would find ourselves in regions of high temperature, in conflict with what we observe, because regions that exited inflation later than ours spent more time undergoing runaway inflationary exponential growth.[3]: 2  For example, observers in a universe 13.8 billion years old (our observed age) are outnumbered by an enormous factor by observers in a universe 13.0 billion years old. This lopsidedness continues until the most numerous observers resembling us are "Boltzmann babies" formed by improbable fluctuations in the hot, very early universe. Physicists therefore reject the simple proper-time cutoff as a failed hypothesis.[6]
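The size of the mismatch can be illustrated with back-of-the-envelope arithmetic. In the sketch below, the inflationary Hubble rate and the exit-time delay are illustrative placeholder values, not measured quantities; the point is only that the e^{3HΔt} volume weighting turns a minuscule delay into an astronomically large preference for younger regions:

```python
import math

# Toy youngness-paradox arithmetic under the proper-time cutoff.
# ASSUMPTIONS: volume grows as exp(3*H*t); the Hubble rate and the
# delay below are illustrative orders of magnitude, not measured values.

H = 1e37    # assumed inflationary Hubble rate, in 1/s (illustrative)
dt = 1e-30  # a region exits inflation 10^-30 s later than ours

# Extra volume factor exp(3*H*dt) for the later-exiting region,
# expressed as a power of ten.
log10_bonus = 3 * H * dt / math.log(10)
print(f"later-exiting regions favored by ~10^{log10_bonus:.3g}")
```

Even a delay of 10^-30 seconds produces a preference factor whose exponent itself has millions of digits' worth of magnitude, which is why later-exiting regions swamp the measure.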

Scale-factor cutoff

Time can be parameterized in ways other than proper time.[3]: 1  One choice is to parameterize by the scale factor of space a, or, more commonly, by η ≡ ln a.[3]: 1  Then a given region of space expands as e^{3η}, independent of H.[3]: 1 

This approach can be generalized to a family of measures in which a small region grows as e^{3βHt_β} for some parameter β and corresponding time-slicing t_β.[3]: 1–2  Any choice of β gives a measure that is stationary at large times.

The scale-factor cutoff measure takes β = 0, which avoids the youngness paradox by not giving greater weight to regions that retain high energy density for long periods.[3]: 2 

This measure is very sensitive to the choice of β, because any β > 0 yields the youngness paradox, while any β < 0 yields an "oldness paradox" in which most life is predicted to exist in cold, empty space as Boltzmann brains rather than as the evolved creatures with orderly experiences that we seem to be.[3]: 2 
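In this family of measures, a region's volume grows as e^{3βHt_β}. The sketch below (arbitrary toy Hubble rates and times, not physical values) shows how the proper-time choice exponentially favors regions with larger Hubble rate while the scale-factor choice weights them equally:

```python
import math

# Sketch of the beta-family of cutoff measures: a region with Hubble rate H,
# followed for time t in the chosen slicing, grows in volume by
# exp(3 * beta * H * t). beta = 1 corresponds to the proper-time cutoff and
# beta = 0 to the scale-factor cutoff. All numbers here are toy values.

def volume_growth(H, t, beta):
    return math.exp(3 * beta * H * t)

H_fast, H_slow, t = 2.0, 1.0, 10.0  # two regions, one inflating twice as fast

# Proper-time cutoff (beta = 1): the high-H region is exponentially favored.
print(volume_growth(H_fast, t, 1) / volume_growth(H_slow, t, 1))  # e^30, huge

# Scale-factor cutoff (beta = 0): both regions receive equal weight.
print(volume_growth(H_fast, t, 0) / volume_growth(H_slow, t, 0))  # exactly 1.0
```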

De Simone et al. (2010) consider the scale-factor cutoff measure to be a very promising solution to the measure problem.[7] This measure has also been shown to produce good agreement with observational values of the cosmological constant.[8]

Stationary

The stationary measure proceeds from the observation that different processes achieve stationarity at different times.[3]: 2  Rather than comparing processes at a given time since the beginning, the stationary measure compares them in terms of the time since each process individually became stationary.[3]: 2  For instance, different regions of the universe can be compared based on the time since star formation began.[3]: 3 

Andrei Linde and coauthors have suggested that the stationary measure avoids both the youngness paradox and Boltzmann brains.[2] However, the stationary measure predicts extreme (either very large or very small) values of the primordial density contrast Q and the gravitational constant G, inconsistent with observations.[7]: 2 

Causal diamond

Reheating marks the end of inflation. The causal diamond is the finite four-volume formed by intersecting the future light cone of an observer crossing the reheating hypersurface with the past light cone of the point where the observer has exited a given vacuum.[3]: 2  Put another way, the causal diamond is[4]

the largest swath accessible to a single observer traveling from the beginning of time to the end of time. The finite boundaries of a causal diamond are formed by the intersection of two cones of light, like the dispersing rays from a pair of flashlights pointed toward each other in the dark. One cone points outward from the moment matter was created after a Big Bang — the earliest conceivable birth of an observer — and the other aims backward from the farthest reach of our future horizon, the moment when the causal diamond becomes an empty, timeless void and the observer can no longer access information linking cause to effect.

The causal diamond measure multiplies the following quantities:[9]: 1, 4 

  • the prior probability that a world line enters a given vacuum
  • the probability that observers emerge in that vacuum, approximated as the difference in entropy between exiting and entering the diamond. ("[T]he more free energy, the more likely it is that observers will emerge.")

Different prior probabilities of vacuum types yield different results.[3]: 2  Entropy production can be approximated as the number of galaxies in the diamond.[3]: 2 

An attraction of this approach is that it avoids comparing infinities, which is the original source of the measure problem.[4]
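The product structure of the causal-diamond measure can be sketched in a few lines. The vacuum names, priors, and entropy numbers below are invented for the example, not taken from the literature:

```python
# Toy version of the causal-diamond weighting: the probability of a vacuum is
# proportional to (prior probability of a world line entering it) times
# (entropy produced inside the diamond, proxied in the text by galaxy counts).
# All names and numbers here are hypothetical illustration values.

vacua = {
    "vacuum_A": {"prior": 0.7, "entropy_produced": 1e9},
    "vacuum_B": {"prior": 0.3, "entropy_produced": 1e11},
}

weights = {name: v["prior"] * v["entropy_produced"] for name, v in vacua.items()}
total = sum(weights.values())
probs = {name: w / total for name, w in weights.items()}

print(probs)  # vacuum_B dominates despite its smaller prior
```

Here the larger entropy production outweighs the smaller prior, so vacuum_B ends up roughly forty times more probable than vacuum_A; all quantities involved are finite, reflecting the attraction noted above.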

Watcher

The watcher measure imagines the world line of an eternal "watcher" that passes through an infinite number of Big Crunch singularities.[10]

Guth-Vanchurin paradox

In all "cutoff" schemes for an expanding infinite multiverse, a finite percentage of observers reach the cutoff during their lifetimes. Under most schemes, if a current observer is still alive five billion years from now, then the later stages of their life must somehow be "discounted" by a factor of around two compared to their current stages of life. For such an observer, Bayes' theorem may appear to break down over this timescale due to anthropic selection effects; this hypothetical breakdown is sometimes called the "Guth-Vanchurin paradox". One proposed resolution is to posit a physical "end of time" that has a fifty percent chance of occurring in the next few billion years. Another, overlapping, proposal is to posit that an observer no longer physically exists once it passes outside a given causal patch, similar to models where a particle is destroyed or ceases to exist when it falls through a black hole's event horizon.[11][12]

Guth and Vanchurin have pushed back on such "end of time" proposals, stating that while "(later) stages of my life will contribute (less) to multiversal averages" than earlier stages, the paradox need not be interpreted as a physical "end of time". The literature proposes at least five possible resolutions:[13][14]

  1. Accept a physical "end of time"
  2. Reject that probabilities in a finite universe are given by relative frequencies of events or histories
  3. Reject calculating probabilities via a geometric cutoff
  4. Reject standard probability theories, and instead posit that "relative probability" is, axiomatically, the limit of a certain geometric cutoff process
  5. Reject eternal inflation

Guth and Vanchurin hypothesize that standard probability theories might be incorrect, which would have counterintuitive consequences.[14]
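The factor-of-two discount can be reproduced in a toy model. The sketch below assumes, purely for illustration, that observer births in the multiverse grow exponentially with a doubling time of five billion years, that each observer lives ten billion years, and that a global cutoff at time T counts only moments of life occurring before T; the late half of a life then receives about half the weight of the early half:

```python
import math

# Toy counting model of the Guth-Vanchurin discount. ASSUMPTIONS (for
# illustration only): birth rate grows as exp(g*b) with a 5 Gyr doubling
# time, lifespan L = 10 Gyr, and a global cutoff at T = 100 Gyr.

g = math.log(2) / 5.0   # births double every 5 billion years
L = 10.0                # observer lifespan, in Gyr
T = 100.0               # global time cutoff, in Gyr

def counted_moments(start_age, end_age, n=200_000):
    """Total life-moments in the age range [start_age, end_age] that occur
    before the cutoff T, summed over births weighted by exp(g*b)."""
    total, db = 0.0, T / n
    for i in range(n):
        b = (i + 0.5) * db                     # midpoint birth time
        lo, hi = b + start_age, min(b + end_age, T)
        if hi > lo:
            total += math.exp(g * b) * (hi - lo) * db
    return total

early = counted_moments(0.0, L / 2)  # first five billion years of life
late = counted_moments(L / 2, L)     # last five billion years of life
print(early / late)  # ≈ 2: late life stages carry about half the weight
```

Because the exponentially growing population keeps doubling every five billion years, moments that occur five billion years later in an observer's life are systematically cut off twice as often, reproducing the discount factor described above.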

References

  1. ^ a b c Carroll, Sean (21 Oct 2011). "The Eternally Existing, Self-Reproducing, Frequently Puzzling Inflationary Universe". Discover. Retrieved 8 January 2015.
  2. ^ a b Andrei Linde; Vitaly Vanchurin; Sergei Winitzki (15 Jan 2009). "Stationary Measure in the Multiverse". Journal of Cosmology and Astroparticle Physics. 2009 (1): 031. arXiv:0812.0005. Bibcode:2009JCAP...01..031L. doi:10.1088/1475-7516/2009/01/031. S2CID 119269055.
  3. ^ a b c d e f g h i j k l m n o p q r s Andrei Linde; Mahdiyar Noorbala (9 Sep 2010). "Measure problem for eternal and non-eternal inflation". Journal of Cosmology and Astroparticle Physics. 2010 (9): 008. arXiv:1006.2170. Bibcode:2010JCAP...09..008L. doi:10.1088/1475-7516/2010/09/008. S2CID 119226491.
  4. ^ a b c d e f Natalie Wolchover; Peter Byrne (3 Nov 2014). "In a Multiverse, What Are the Odds?". Retrieved 8 January 2015.
  5. ^ a b Tegmark, Max (2014). "Chapter 11". Our Mathematical Universe: My Quest for the Ultimate Nature of Reality. Alfred A. Knopf. ISBN 9780307744258.
  6. ^ Bousso, Raphael; Freivogel, Ben; Yang, I. S. (2008). "Boltzmann babies in the proper time measure". Physical Review D. 77 (10): 103514.
  7. ^ a b Andrea De Simone; Alan H. Guth; Andrei Linde; Mahdiyar Noorbala; Michael P. Salem; Alexander Vilenkin (14 Sep 2010). "Boltzmann brains and the scale-factor cutoff measure of the multiverse". Phys. Rev. D. 82 (6): 063520. arXiv:0808.3778. Bibcode:2010PhRvD..82f3520D. doi:10.1103/PhysRevD.82.063520. S2CID 17348306.
  8. ^ Andrea De Simone; Alan H. Guth; Michael P. Salem; Alexander Vilenkin (12 September 2008). "Predicting the cosmological constant with the scale-factor cutoff measure". Phys. Rev. D. 78 (6): 063520. arXiv:0805.2173. Bibcode:2008PhRvD..78f3520D. doi:10.1103/PhysRevD.78.063520. S2CID 118731152.
  9. ^ Raphael Bousso (6 November 2006). "Holographic probabilities in eternal inflation". Phys. Rev. Lett. 97 (19): 191302. arXiv:hep-th/0605263. Bibcode:2006PhRvL..97s1302B. doi:10.1103/PhysRevLett.97.191302. PMID 17155610. S2CID 977375.
  10. ^ Jaume Garriga; Alexander Vilenkin (24 Apr 2013). "Watchers of the multiverse". Journal of Cosmology and Astroparticle Physics. 2013 (5): 037. arXiv:1210.7540. Bibcode:2013JCAP...05..037G. doi:10.1088/1475-7516/2013/05/037. S2CID 118444431.
  11. ^ Courtland, Rachel (2010). "Countdown to oblivion: Why time itself could end". New Scientist. Retrieved 4 November 2018.
  12. ^ Freivogel, Ben (21 October 2011). "Making predictions in the multiverse". Classical and Quantum Gravity. 28 (20): 204007. arXiv:1105.0244. Bibcode:2011CQGra..28t4007F. doi:10.1088/0264-9381/28/20/204007. S2CID 43365582.
  13. ^ Gefter, Amanda (2011). "Time need not end in the multiverse". New Scientist. Retrieved 25 March 2020.
  14. ^ a b Guth, Alan H.; Vanchurin, Vitaly (2011). "Eternal Inflation, Global Time Cutoff Measures, and a Probability Paradox". arXiv:1108.0665.