The characterization of the universe as finely tuned suggests that the occurrence of life in the universe is very sensitive to the values of certain fundamental physical constants and that the observed values are, for some reason, improbable. If the values of any of certain free parameters in contemporary physical theories had differed only slightly from those observed, the evolution of the universe would have proceeded very differently and life as it is understood may not have been possible.
Various explanations of this ostensible fine-tuning have been proposed. However, the belief that the observed values require explanation depends on assumptions about what values are probable or “natural” in some sense. Alternatively, the anthropic principle may be understood to render the observed values tautological and not in need of explanation.
In 1913, the chemist Lawrence Joseph Henderson (1878–1942) wrote The Fitness of the Environment, one of the first books to explore concepts of fine-tuning in the universe. Henderson discusses the importance of water and the environment with respect to living things, pointing out that life depends entirely on the very specific environmental conditions on Earth, especially with regard to the prevalence and properties of water.
In 1961, physicist Robert H. Dicke claimed that certain forces in physics, such as gravity and electromagnetism, must be perfectly fine-tuned for life to exist anywhere in the universe. Fred Hoyle also argued for a fine-tuned universe in his 1984 book The Intelligent Universe, writing: “The list of anthropic properties, apparent accidents of a non-biological nature without which carbon-based and hence human life could not exist, is large and impressive.”
Belief in the fine-tuned universe led to the expectation that the Large Hadron Collider would produce evidence of physics beyond the standard model. However, by 2012, results from the LHC had ruled out the class of supersymmetric theories that might have explained the fine-tuning.
The premise of the fine-tuned universe assertion is that a small change in any one of several physical constants would make the universe radically different. As Stephen Hawking has noted,
“The laws of science, as we know them at present, contain many fundamental numbers, like the size of the electric charge of the electron and the ratio of the masses of the proton and the electron. … The remarkable fact is that the values of these numbers seem to have been very finely adjusted to make possible the development of life.”
If, for example, the strong nuclear force were 2% stronger than it is (i.e. if the coupling constant representing its strength were 2% larger), while the other constants were left unchanged, diprotons would be stable; according to physicist Paul Davies, hydrogen would fuse into them instead of deuterium and helium. This would drastically alter the physics of stars, and presumably preclude the existence of life similar to what we observe on Earth. The existence of the diproton would short-circuit the slow fusion of hydrogen into deuterium. Hydrogen would fuse so easily that it is likely that all of the universe’s hydrogen would be consumed in the first few minutes after the Big Bang. This “diproton argument” is disputed by other physicists, who calculate that as long as the increase in strength is less than 50%, stellar fusion could occur despite the existence of stable diprotons.
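A rough way to see why such a small change matters is the following illustrative sketch, which uses commonly quoted approximate figures rather than the detailed calculations referenced above. The deuteron is bound by about 2.2 MeV, while the diproton narrowly fails to be bound, by a margin small compared with that binding energy. Since typical depths of the nucleon–nucleon potential are a few tens of MeV, increasing the strength of the strong interaction by 2% would, on a crude linear estimate, add extra binding of roughly

ΔE ≈ 0.02 × (30–40 MeV) ≈ 0.6–0.8 MeV,

enough on this estimate to close the gap and make the diproton stable. Ordinary hydrogen burning is slow because its first step, p + p → d + e⁺ + ν, must proceed through the weak interaction; a bound diproton would open the much faster strong and electromagnetic channel p + p → ²He + γ, which is the “short-circuit” described above.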
The precise formulation of the idea is made difficult by the fact that physicists do not yet know how many independent physical constants there are. The current standard model of particle physics has 25 freely adjustable parameters and general relativity has one additional parameter, the cosmological constant, which is known to be non-zero, but profoundly small in value. However, because physicists have not developed an empirically successful theory of quantum gravity, there is no known way to combine quantum mechanics, on which the standard model depends, and general relativity. Without knowledge of this more complete theory that is suspected to underlie the standard model, definitively counting the number of truly independent physical constants is not possible. In some candidate theories, the number of independent physical constants may be as small as one. For example, the cosmological constant may be a fundamental constant, but attempts have also been made to calculate it from other constants, and according to the author of one such calculation, “the small value of the cosmological constant is telling us that a remarkably precise and totally unexpected relation exists among all the parameters of the Standard Model of particle physics, the bare cosmological constant and unknown physics.”
Martin Rees formulates the fine-tuning of the universe in terms of the following six dimensionless physical constants:
- N, the ratio of the strength of electromagnetism to the strength of gravity for a pair of protons, is approximately 10³⁶ (see the worked estimates following this list). According to Rees, if it were significantly smaller, only a small and short-lived universe could exist.
- Epsilon (ε), a measure of the nuclear efficiency of fusion from hydrogen to helium, is 0.007: when four nucleons fuse into helium, 0.007 (0.7%) of their mass is converted to energy. The value of ε is in part determined by the strength of the strong nuclear force. If ε were 0.006, only hydrogen could exist, and complex chemistry would be impossible. According to Rees, if it were above 0.008, no hydrogen would exist, as all the hydrogen would have been fused shortly after the Big Bang. Other physicists disagree, calculating that substantial hydrogen remains as long as the strong force coupling constant increases by less than about 50%.
- Omega (Ω), commonly known as the density parameter, is the relative importance of gravity and expansion energy in the universe. It is the ratio of the mass density of the universe to the “critical density” and is approximately 1. If gravity were too strong compared with dark energy and the initial metric expansion, the universe would have collapsed before life could have evolved. On the other hand, if gravity were too weak, no stars would have formed.
- Lambda (Λ), commonly known as the cosmological constant, describes the ratio of the density of dark energy to the critical energy density of the universe, given certain reasonable assumptions such as positing that dark energy density is a constant. In terms of Planck units, and as a natural dimensionless value, the cosmological constant, Λ, is on the order of 10⁻¹²². This is so small that it has no significant effect on cosmic structures that are smaller than a billion light-years across. If the cosmological constant were not extremely small, stars and other astronomical structures would not be able to form.
- Q, the ratio of the gravitational energy required to pull a large galaxy apart to the energy equivalent of its mass, is around 10⁻⁵. If it is too small, no stars can form. If it is too large, no stars can survive because the universe is too violent, according to Rees.
- D, the number of spatial dimensions in spacetime, is 3. Rees claims that life could not exist if there were 2 or 4 spatial dimensions, nor if spacetime contained any number of time dimensions other than 1. However, contends Rees, this does not preclude the existence of ten-dimensional strings.
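The first few of these orders of magnitude can be checked with a back-of-the-envelope calculation. The sketch below uses rounded standard values of the physical constants and is only illustrative; it is not taken from Rees's own presentation.

N = e² / (4πε₀ G m_p²) ≈ (2.3 × 10⁻²⁸ J·m) / ((6.67 × 10⁻¹¹ m³ kg⁻¹ s⁻²) × (1.67 × 10⁻²⁷ kg)²) ≈ 1.2 × 10³⁶

ε: four hydrogen atoms have a combined mass of about 4 × 1.0078 u ≈ 4.031 u, while a helium-4 atom has a mass of about 4.003 u, so the fraction of mass released is (4.031 − 4.003) / 4.031 ≈ 0.007.

Λ in Planck units: with the observed value Λ ≈ 1.1 × 10⁻⁵² m⁻² and the Planck length l_P ≈ 1.6 × 10⁻³⁵ m, the dimensionless combination is Λ l_P² ≈ 3 × 10⁻¹²², i.e. of order 10⁻¹²².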
Carbon and oxygen
An older example is the Hoyle state, the third-lowest energy state of the carbon-12 nucleus, with an energy of 7.656 MeV above the ground level. According to one calculation, if the state's energy level were lower than 7.3 MeV or higher than 7.9 MeV, insufficient carbon would exist to support life. Furthermore, to explain the universe's abundance of carbon, the Hoyle state must be further tuned to a value between 7.596 and 7.716 MeV. A similar calculation, focusing on the underlying fundamental constants that give rise to the various energy levels, concludes that the strong force must be tuned to a precision of at least 0.5%, and the electromagnetic force to a precision of at least 4%, to prevent either carbon production or oxygen production from dropping significantly.
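To put these windows in perspective, some simple arithmetic on the figures just quoted: the broader life-permitting window of 7.3–7.9 MeV is about 0.6 MeV wide, roughly 8% of the 7.656 MeV excitation energy, while the narrower window of 7.596–7.716 MeV is only 0.12 MeV wide, about 1.6% of the excitation energy, or roughly ±0.06 MeV around the measured value.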
A slightly larger quantity of dark energy, or a slightly larger value of the cosmological constant, would have caused space to expand rapidly enough that galaxies would not have formed.
The fine-tuned universe argument has been criticized as an argument from lack of imagination, as it assumes no other forms of life are possible, an assumption sometimes referred to as carbon chauvinism. Conceptually, alternative biochemistry or other forms of life are possible. Regarding this, physicist Victor Stenger argued: “We have no reason to believe that our kind of carbon-based life is all that is possible. Furthermore, modern cosmology theorises that multiple universes may exist with different constants and laws of physics. So, it is not surprising that we live in the one suited for us. The universe is not fine-tuned to life; life is fine-tuned to the universe.”
There are also naturalistic explanations of the apparent fine-tuning. First, as noted in the discussion of the premise above, the fine-tuning might be an illusion: we do not know the true number of independent physical constants, which could be small and might even reduce to one. Nor do we know the laws of the “potential universe factory”, that is, the range and statistical distribution governing the “choice” of each constant (a choice that also depends on our arbitrary selection of units and of which constants to treat as fundamental). Still, as modern cosmology has developed, various hypotheses that do not presume hidden order have been proposed. One is an oscillatory universe or a multiverse, in which fundamental physical constants are postulated to take random values in different iterations of reality. Under this hypothesis, separate parts of reality would have wildly different characteristics. In such scenarios, the appearance of fine-tuning is explained as a consequence of the weak anthropic principle and selection bias (specifically survivorship bias): only those universes whose fundamental constants are hospitable to life, such as the universe we observe, would give rise to living beings capable of contemplating the questions of origins and of fine-tuning. All other universes would go utterly unbeheld by any such beings.
The multiverse hypothesis proposes the existence of many universes with different physical constants, some of which are hospitable to intelligent life. Because we are intelligent beings, it is unsurprising that we find ourselves in a hospitable universe if there is such a multiverse. The multiverse hypothesis is therefore thought to provide an elegant explanation of the finding that we exist despite the required fine-tuning.
The multiverse idea has led to considerable research into the anthropic principle and has been of particular interest to particle physicists, because theories of everything do apparently generate large numbers of universes in which the physical constants vary widely. As yet, there is no evidence for the existence of a multiverse, but some versions of the theory make predictions that researchers studying M-theory and gravity leaks hope to test in the near future. Some multiverse theories are not falsifiable, and thus scientists may be reluctant to call any multiverse theory “scientific”. UNC-Chapel Hill professor Laura Mersini-Houghton claimed that the WMAP cold spot may provide testable empirical evidence for a parallel universe, although this claim was later refuted when the WMAP cold spot was found to be nothing more than a statistical artifact. Variants on this approach include Lee Smolin's notion of cosmological natural selection, the Ekpyrotic universe, and the Bubble universe theory.
Critics of the multiverse-related explanations argue that there is no independent evidence that other universes exist. Some criticize the inference from fine-tuning for life to a multiverse as fallacious, whereas others defend it against that challenge.
Stephen Hawking, along with Thomas Hertog of CERN, proposed that the universe’s initial conditions consisted of a superposition of many possible initial conditions, only a small fraction of which contributed to the conditions we see today. According to their theory, it is inevitable that we find our universe’s “fine-tuned” physical constants, as the current universe “selects” only those past histories that led to the present conditions. In this way, top-down cosmology provides an anthropic explanation for why we find ourselves in a universe that allows matter and life, without invoking the ontic existence of the Multiverse.
One hypothesis is that the universe may have been designed by extra-universal aliens. Some believe this would solve the problem of how a designer or design team capable of fine-tuning the universe could come to exist. Cosmologist Alan Guth believes humans will in time be able to generate new universes; by implication, previous intelligent entities may have generated our universe. This idea leads to the possibility that the extra-universal designer or designers are themselves the product of an evolutionary process in their own universe, which must therefore itself be able to sustain life. However, it also raises the question of where that universe came from, leading to an infinite regress.
The Designer Universe theory of John Gribbin suggests that the universe could have been made deliberately by an advanced civilization in another part of the Multiverse, and that this civilization may have been responsible for causing the Big Bang.
While few scientists consider a supernatural explanation necessary, some individual scientists, theologians, and philosophers, as well as certain religious groups, argue that providence or creation is responsible for the fine-tuning.
Christian philosopher Alvin Plantinga argues that random chance, applied to a single and sole universe, only raises the question as to why this universe could be so “lucky” as to have precise conditions that support life at least at some place (the Earth) and time (within millions of years of the present).
One reaction to these apparent enormous coincidences is to see them as substantiating the theistic claim that the universe has been created by a personal God and as offering the material for a properly restrained theistic argument—hence the fine-tuning argument. It’s as if there are a large number of dials that have to be tuned to within extremely narrow limits for life to be possible in our universe. It is extremely unlikely that this should happen by chance, but much more likely that this should happen, if there is such a person as God. — Alvin Plantinga, “The Dawkins Confusion: Naturalism ad absurdum”
This fine-tuning of the universe is cited by philosopher and Christian apologist William Lane Craig as evidence for the existence of God or some form of intelligence capable of manipulating (or designing) the basic physics that governs the universe. Craig argues, however, “that the postulate of a divine Designer does not settle for us the religious question.”
Philosopher and theologian Richard Swinburne reaches the design conclusion using Bayesian probability.
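In outline, such an argument applies Bayes' theorem to the proposition D that the universe was designed, given the evidence F that its constants are life-permitting. The schematic form below is a generic illustration of this style of reasoning, not necessarily Swinburne's own formulation:

P(D | F) = P(F | D) P(D) / [ P(F | D) P(D) + P(F | ¬D) P(¬D) ]

The argument turns on the claim that P(F | ¬D) is extremely small while P(F | D) is not, so that the posterior probability of design comes out high; critics reply that the prior P(D) and the likelihood P(F | ¬D) cannot be assigned in any principled way, echoing the point made above that judgments of improbability presuppose assumptions about which values are “natural”.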
Scientist and theologian Alister McGrath has pointed out that the fine-tuning of carbon is even responsible for nature’s ability to tune itself to any degree.
The entire biological evolutionary process depends upon the unusual chemistry of carbon, which allows it to bond to itself, as well as other elements, creating highly complex molecules that are stable over prevailing terrestrial temperatures, and are capable of conveying genetic information (especially DNA). […] Whereas it might be argued that nature creates its own fine-tuning, this can only be done if the primordial constituents of the universe are such that an evolutionary process can be initiated. The unique chemistry of carbon is the ultimate foundation of the capacity of nature to tune itself.
Theoretical physicist and Anglican priest John Polkinghorne has stated: “Anthropic fine tuning is too remarkable to be dismissed as just a happy accident.”
Adapted from Wikipedia, the free encyclopedia