Measure problem (cosmology)

The measure problem in cosmology concerns how to compute the ratios of universes of different types within a multiverse. It typically arises in the context of eternal inflation. The problem arises because different approaches to calculating these ratios yield different results, and it is not clear which approach (if any) is correct. Measures can be evaluated by whether they predict observed physical constants, as well as by whether they avoid counterintuitive implications such as the ''youngness paradox'' or Boltzmann brains. While dozens of measures have been proposed, few physicists consider the problem to be solved.


The problem

Infinite multiverse theories are becoming increasingly popular, but because they involve infinitely many instances of different types of universes, it is unclear how to compute the fractions of each type of universe. Alan Guth put it this way:

: In a single universe, cows born with two heads are rarer than cows born with one head. [But in an infinitely branching multiverse,] there are an infinite number of one-headed cows and an infinite number of two-headed cows. What happens to the ratio?

Sean M. Carroll offered another informal example:

: Say there are an infinite number of universes in which George W. Bush became President in 2000, and also an infinite number in which Al Gore became President in 2000. To calculate the fraction N(Bush)/N(Gore), we need to have a measure – a way of taming those infinities. Usually this is done by "regularization." We start with a small piece of universe where all the numbers are finite, calculate the fraction, and then let our piece get bigger, and calculate the limit that our fraction approaches.

Different procedures for computing the limit of this fraction yield wildly different answers. One way to illustrate how different regularization methods produce different answers is to calculate the limit of the fraction of positive integers that are even. Suppose the integers are ordered the usual way,

: 1, 2, 3, 4, 5, 6, 7, 8, ...

At a ''cutoff'' of "the first five elements of the list", the fraction is 2/5; at a cutoff of "the first six elements", the fraction is 1/2; as the cutoff grows, the fraction converges to 1/2. However, if the integers are ordered such that each odd number is followed by two consecutive even numbers,

: 1, 2, 4, 3, 6, 8, 5, 10, 12, 7, 14, 16, ...

the fraction of integers that are even converges to 2/3 rather than 1/2.

A popular way to decide which ordering to use in regularization is to pick the simplest or most natural-seeming one. Everyone agrees that the first sequence, ordered by increasing size of the integers, seems more natural. Similarly, many physicists agree that the "proper-time cutoff measure" (below) seems the simplest and most natural method of regularization. Unfortunately, the proper-time cutoff measure seems to produce incorrect results.

The measure problem is important in cosmology because, in order to compare cosmological theories in an infinite multiverse, we need to know which types of universes they predict to be more common than others.
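
The effect of the ordering can be checked numerically. The following is a minimal Python sketch (the function names are purely illustrative) that computes the running fraction of even integers under the two orderings above; the first fraction approaches 1/2 and the second approaches 2/3 as the cutoff grows.

```python
# Minimal sketch: the regularized fraction of even integers depends on
# the order in which the cutoff window is grown.

def natural_order(n):
    """First n positive integers in the usual order: 1, 2, 3, 4, ..."""
    return list(range(1, n + 1))

def odd_even_even_order(n):
    """First n terms of the reordering 1, 2, 4, 3, 6, 8, 5, 10, 12, ..."""
    seq, odd, even = [], 1, 2
    while len(seq) < n:
        seq.append(odd)        # one odd number ...
        seq.append(even)       # ... followed by two consecutive even numbers
        seq.append(even + 2)
        odd += 2
        even += 4
    return seq[:n]

def even_fraction(seq):
    return sum(1 for k in seq if k % 2 == 0) / len(seq)

for cutoff in (6, 60, 600, 6000):
    print(cutoff,
          round(even_fraction(natural_order(cutoff)), 4),        # tends to 1/2
          round(even_fraction(odd_even_even_order(cutoff)), 4))  # tends to 2/3
```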


Proposed measures


Proper-time cutoff

The ''proper-time cutoff measure'' considers the probability P(\phi, t) of finding a given scalar field \phi at a given proper time t. During inflation, the region around a point grows like e^{3H\Delta t} in a small proper-time interval \Delta t, where H is the Hubble parameter. This measure has the advantage of being stationary, in the sense that probabilities remain the same over time in the limit of large t. However, it suffers from the ''youngness paradox'', which makes it exponentially more probable that we would find ourselves in regions of high temperature, in conflict with what we observe; this is because regions that exited inflation later than our region spent more time than ours undergoing runaway inflationary exponential growth. For example, observers in a universe 13.8 billion years old (our observed age) are outnumbered by an enormous factor by observers in a universe 13.0 billion years old. This lopsidedness continues until the most numerous observers resembling us are "Boltzmann babies" formed by improbable fluctuations in the hot, very early universe. For this reason, physicists reject the simple proper-time cutoff as a failed hypothesis.
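
A toy evaluation of the volume factor makes the paradox concrete. The sketch below uses made-up, non-physical values for H and \Delta t simply to show how the e^{3H\Delta t} weighting in the proper-time cutoff overwhelmingly favours regions that kept inflating slightly longer.

```python
import math

# Toy illustration (made-up numbers, not physical constants): under the
# proper-time cutoff, regions that remain in the inflating phase for an extra
# interval dt are weighted by the additional volume factor exp(3 * H * dt).

H = 5.0    # hypothetical inflationary Hubble rate, in arbitrary inverse-time units
dt = 3.0   # hypothetical extra inflating time, in the same arbitrary units

volume_weight = math.exp(3 * H * dt)
print(f"Regions exiting inflation {dt} units later are favoured by ~{volume_weight:.3g}")
# With realistic inflationary Hubble rates this factor is astronomically large,
# which is why the measure predicts that observers like us should be "young".
```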


Scale-factor cutoff

Time can be parameterized in ways other than proper time. One choice is to parameterize by the scale factor of space a, or more commonly by \eta \sim \log a. Then a given region of space expands as e^{3\Delta\eta}, independent of H. This approach can be generalized to a family of measures in which a small region grows as e^{3 H^\beta \Delta t_\beta} for some \beta and time-slicing approach t_\beta. Any choice of \beta remains stationary for large times. The ''scale-factor cutoff measure'' takes \beta = 0, which avoids the youngness paradox by not giving greater weight to regions that retain a high energy density for long periods. Predictions of this family of measures are very sensitive to the choice of \beta: any \beta > 0 yields the youngness paradox, while any \beta < 0 yields an "oldness paradox" in which most life is predicted to exist in cold, empty space as Boltzmann brains rather than as the evolved creatures with orderly experiences that we seem to be. De Simone et al. (2010) consider the scale-factor cutoff measure to be a promising solution to the measure problem. This measure has also been shown to produce good agreement with observed values of the cosmological constant.
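
The role of \beta can be seen with a toy comparison, sketched below with illustrative (non-physical) numbers: the growth factor e^{3 H^\beta \Delta t_\beta} is evaluated for a hypothetical high-H region and a hypothetical low-H region, showing that \beta > 0 favours the hot region, \beta < 0 favours the cold region, and \beta = 0 treats them equally.

```python
import math

# Toy illustration of the beta-family of measures described above, in which a
# small region grows as exp(3 * H**beta * dt).  All numbers are illustrative.

def growth_factor(H, dt, beta):
    return math.exp(3 * H**beta * dt)

H_hot, H_cold = 10.0, 0.1   # hypothetical high- and low-energy-density regions
dt = 2.0                    # hypothetical slicing-time interval

for beta in (1, 0, -1):
    ratio = growth_factor(H_hot, dt, beta) / growth_factor(H_cold, dt, beta)
    print(f"beta = {beta:+d}: hot/cold volume ratio = {ratio:.3g}")
# beta > 0 favours high-H regions (youngness paradox), beta < 0 favours low-H
# regions (oldness paradox), and beta = 0 (scale-factor cutoff) weights them equally.
```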


Stationary

The ''stationary measure'' proceeds from the observation that different processes achieve stationarity of P(\phi, t) at different times. Thus, rather than comparing processes at a given time since the beginning, the stationary measure compares them in terms of the time since each process individually became stationary. For instance, different regions of the universe can be compared based on the time since star formation began. Andrei Linde and coauthors have suggested that the stationary measure avoids both the youngness paradox and Boltzmann brains. However, the stationary measure predicts extreme (either very large or very small) values of the primordial density contrast Q and the gravitational constant G, inconsistent with observations.


Causal diamond

Reheating marks the end of inflation. The ''causal diamond'' is the finite four-volume formed by intersecting the future light cone of an observer crossing the reheating hypersurface with the past light cone of the point where the observer has exited a given vacuum. Put another way, the causal diamond is:

: the largest swath accessible to a single observer traveling from the beginning of time to the end of time. The finite boundaries of a causal diamond are formed by the intersection of two cones of light, like the dispersing rays from a pair of flashlights pointed toward each other in the dark. One cone points outward from the moment matter was created after a Big Bang – the earliest conceivable birth of an observer – and the other aims backward from the farthest reach of our future horizon, the moment when the causal diamond becomes an empty, timeless void and the observer can no longer access information linking cause to effect.

The ''causal diamond measure'' multiplies the following quantities:
* the prior probability that a world line enters a given vacuum
* the probability that observers emerge in that vacuum, approximated as the difference in entropy between exiting and entering the diamond ("[Th]e more free energy, the more likely it is that observers will emerge.")

Different prior probabilities of vacuum types yield different results. Entropy production can be approximated as the number of galaxies in the diamond.
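
As a rough illustration of how the two factors combine, the sketch below assigns made-up prior probabilities and galaxy counts (galaxies standing in for entropy production, as noted above) to two hypothetical vacua and computes their relative weights.

```python
# Toy illustration of the causal-diamond measure: the weight of a vacuum is
# (prior probability of entering it) * (entropy produced inside the diamond),
# with entropy production approximated by the number of galaxies formed.
# The vacua, priors, and galaxy counts below are made-up illustrative numbers.

vacua = {
    "vacuum_A": {"prior": 0.7, "galaxies": 1e3},
    "vacuum_B": {"prior": 0.3, "galaxies": 1e8},
}

weights = {name: v["prior"] * v["galaxies"] for name, v in vacua.items()}
total = sum(weights.values())

for name, w in weights.items():
    print(f"{name}: fraction of observers = {w / total:.6f}")
# A vacuum with a modest prior can still dominate if far more entropy
# (many more galaxies) is produced inside its causal diamond.
```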


Watcher

The ''watcher measure'' imagines the world line of an eternal "watcher" that passes through an infinite number of Big Crunch singularities.


Guth–Vanchurin paradox

In all "cutoff" schemes for an expanding infinite multiverse, a finite percentage of observers reach the cutoff during their lifetimes. Under most schemes, if a current observer is still alive five billion years from now, then the later stages of their life must somehow be "discounted" by a factor of around two compared to their current stages of life. For such an observer,
Bayes' theorem Bayes' theorem (alternatively Bayes' law or Bayes' rule, after Thomas Bayes) gives a mathematical rule for inverting Conditional probability, conditional probabilities, allowing one to find the probability of a cause given its effect. For exampl ...
may appear to break down over this timescale due to
anthropic Anthropic PBC is an American artificial intelligence (AI) startup company founded in 2021. Anthropic has developed a family of large language models (LLMs) named Claude as a competitor to OpenAI's ChatGPT and Google's Gemini. According to the ...
selection effects; this hypothetical breakdown is sometimes called the "Guth–Vanchurin paradox". One proposed resolution to the paradox is to posit a physical "end of time" that has a fifty percent chance of occurring in the next few billion years. Another, overlapping, proposal is to posit that an observer no longer physically exists when it passes outside a given causal patch, similar to models where a particle is destroyed or ceases to exist when it falls through a black hole's event horizon. Guth and Vanchurin have pushed back on such "end of time" proposals, stating that while "(later) stages of my life will contribute (less) to multiversal averages" than earlier stages, this paradox need not be interpreted as a physical "end of time". The literature proposes at least five possible resolutions:Guth, Alan H., and Vitaly Vanchurin. "Eternal Inflation, Global Time Cutoff Measures, and a Probability Paradox." arXiv preprint arXiv:1108.0665 (2011). # Accept a physical "end of time" # Reject that probabilities in a finite universe are given by relative frequencies of events or histories # Reject calculating probabilities via a geometric cutoff # Reject standard probability theories, and instead posit that "relative probability" is, axiomatically, the limit of a certain geometric cutoff process # Reject eternal inflation Guth and Vanchurin hypothesize that standard probability theories might be incorrect, which would have counterintuitive consequences.
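
A toy model, sketched below with made-up numbers, shows where the factor-of-two discount comes from: if the number of observers being born grows exponentially up to a geometric cutoff, then the fraction of observers whose later life stage still lies inside the cutoff falls off exponentially with the age of that stage, independent of where the cutoff is placed.

```python
import math

# Toy illustration of the discounting of later life stages under a geometric
# cutoff.  Assume (illustratively) that the number of observers born per unit
# time grows as exp(gamma * t) up to a cutoff at t_cut.  An observer born at t
# is counted at age tau only if t + tau < t_cut, so the fraction of observers
# whose age-tau stage lies inside the cutoff is exp(-gamma * tau), whatever t_cut is.

gamma = 0.14   # hypothetical growth rate, per billion years (chosen for illustration)
tau = 5.0      # life stage five billion years in the future

discount = math.exp(-gamma * tau)
print(f"Relative weight of a stage {tau:.0f} billion years from now: {discount:.2f}")
# With gamma chosen so that exp(gamma * 5) is roughly 2, this reproduces the
# factor-of-two discount described above.
```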


See also

* Anthropic principle


References

{{Reflist}}

[[Category:Inflation (cosmology)]]
[[Category:Multiverse]]
[[Category:Physical cosmology]]
[[Category:Unsolved problems in physics]]