Digital physics

From Wikipedia, the free encyclopedia

Digital physics holds the basic premise that the entire history of our universe is computable; that is, it could be produced as the output of a (presumably short) computer program.

In more detail, it involves one or more of the following hypotheses: that the universe or reality is essentially informational, is essentially computable, can be described digitally, is in essence digital, is itself a computer, or is the output of a simulated reality exercise.

Pancomputationalism, computational universe theory, John Archibald Wheeler's "It from bit", and Max Tegmark's Ultimate ensemble are examples of related ideas.

History

The hypothesis was pioneered in Konrad Zuse's book Rechnender Raum (translated by MIT into English as Calculating Space, 1970). Its proponents include Edward Fredkin,[1] Stephen Wolfram,[2][3] Juergen Schmidhuber,[4] and Nobel laureate Gerard 't Hooft.[5] They hold that the apparently probabilistic nature of quantum physics is not necessarily incompatible with the notion of computability. Quantum versions of digital physics have recently been proposed by Seth Lloyd,[6] David Deutsch, and Paola Zizzi.[7]

Digital physics

Overview

The theory of digital physics is that there exists a program for a universal computer which computes the dynamic evolution of our world. For example, the computer could be a huge cellular automaton, as suggested by Zuse (1967), or a universal Turing machine, as suggested by Schmidhuber (1997), who pointed out that there is a very short program that computes all possible computable universes in an asymptotically optimal way.
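
The core of Schmidhuber's observation is a dovetailing scheme: interleave the execution of every possible program so that each one eventually receives unbounded runtime. The Python sketch below illustrates only that scheduling idea; the program enumeration and step function are hypothetical stand-ins, and Schmidhuber's actual construction additionally allocates runtime in proportion to 2^-l(p) for a program of length l(p).

    # A minimal sketch (not Schmidhuber's actual algorithm) of dovetailing:
    # admit one new program per phase and give every admitted program one
    # further step, so each program eventually receives unbounded runtime.
    from itertools import count

    def enumerate_programs():
        """Hypothetical enumeration: program i is identified with its index."""
        return count()

    def run_one_step(program_id, state):
        """Placeholder for executing one step of program `program_id`."""
        return state + 1  # stand-in dynamics

    def dovetail(phases):
        states = {}                        # program id -> current state
        programs = enumerate_programs()
        for _ in range(phases):
            states[next(programs)] = 0     # admit one new program
            for pid in list(states):       # one step for every admitted program
                states[pid] = run_one_step(pid, states[pid])
        return states

    print(dovetail(5))   # {0: 5, 1: 4, 2: 3, 3: 2, 4: 1}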

Some try to identify single physical particles with simple bits. For example, when a particle such as an electron switches from one quantum state to another, this may be regarded as a bit changing from one value (0) to the other (1). A single bit is all that is required to describe a single quantum switch of a given particle. Since the world is built up of basic particles whose behaviour can be completely described by the quantum switches they perform, the world as a whole can be described by bits. Every state is information, and every change of state is a change in information (requiring one or more bit manipulations). It follows that the known universe could be simulated by a computer capable of storing and manipulating about 10^90 bits, and could very well itself be such a simulation. Should this be the case, then hypercomputation would be impossible.
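
As an illustration only, the paragraph's picture of "every change is a bit manipulation" can be rendered as a toy state of bits, with a single quantum switch modelled as a bit flip; the update below is an arbitrary stand-in, not a claim about real particle dynamics.

    # Illustrative toy model: a "universe" of 8 two-state particles encoded
    # as a tuple of bits; one quantum switch is modelled as one bit flip.
    def flip(state, index):
        """Flip the bit at `index`, returning the new state."""
        return state[:index] + (1 - state[index],) + state[index + 1:]

    universe = (0,) * 8           # all particles initially in state 0
    universe = flip(universe, 3)  # particle 3 switches state
    print(universe)               # (0, 0, 0, 1, 0, 0, 0, 0)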

Loop quantum gravity could lend support to digital physics, in that it assumes space-time to be quantized.

It from bit

Physicist John Archibald Wheeler wrote "it is not unreasonable to imagine that information sits at the core of physics, just as it sits at the core of a computer". David Chalmers of the Australian National University summarised his views as:

"Wheeler (1990) has suggested that information is fundamental to the physics of the universe. According to this "it from bit" doctrine, the laws of physics can be cast in terms of information, postulating different states that give rise to different effects without actually saying what those states are. It is only their position in an information space that counts. If so, then information is a natural candidate to also play a role in a fundamental theory of consciousness. We are led to a conception of the world on which information is truly fundamental, and on which it has two basic aspects, corresponding to the physical and the phenomenal features of the world".

[8]

Digital vs. informational physics

Not every informational approach to physics (or ontology) is necessarily digital. According to Luciano Floridi, informational structural realism is a version of structural realism that supports the ontological commitment to a view of the world as the totality of informational objects dynamically interacting with each other. Such informational objects are understood as constraining affordances.

Digital ontology and pancomputationalism are also independent positions. Famously, Wheeler supported the former but not (or at least not explicitly) the latter. As he wrote: “It from bit. Otherwise put, every ‘it’ – every particle, every field of force, even the space-time continuum itself – derives its function, its meaning, its very existence entirely – even if in some contexts indirectly – from the apparatus-elicited answers to yes-or-no questions, binary choices, bits. ‘It from bit’ symbolizes the idea that every item of the physical world has at bottom – a very deep bottom, in most instances – an immaterial source and explanation; that which we call reality arises in the last analysis from the posing of yes–no questions and the registering of equipment-evoked responses; in short, that all things physical are information-theoretic in origin and that this is a participatory universe” (John Archibald Wheeler [1990], 5).

On the other hand, pancomputationalists like Lloyd [2006], who describes the universe not as a digital but as a quantum computer, can still hold an analogue or hybrid ontology. And informational ontologists like Sayre and Floridi do not have to embrace either a digital ontology or a pancomputationalist position. (Source: Floridi's talk on the informational nature of reality, abstract at the E-CAP conference 2006.)


Computational foundations

Turing machines

Theoretical computer science is founded upon the concept of a Turing machine, a hypothetical computer described by Alan Turing in 1936. Although they are mechanically simple, it turns out, as stated in the Church-Turing thesis, that Turing machines are powerful enough to solve any "reasonable" problem. (For theoretical computer scientists, "power" is the ability to solve problems at all rather than solving them quickly). A Turing machine therefore sets the practical "ceiling" on computational power, apart from the hypothetical possibilities presented by hypercomputers.
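
To make the object under discussion concrete, here is a minimal Turing machine simulator in Python. The transition table is a hypothetical toy machine (it moves right over a block of 1s, writes one more 1, and halts), not an example drawn from the article or the literature.

    # Minimal Turing machine simulator.  `transitions` maps
    # (state, symbol) -> (new_state, written_symbol, head_move).
    def run_turing_machine(transitions, tape, state="start", blank=0, max_steps=1000):
        cells = dict(enumerate(tape))
        head = 0
        for _ in range(max_steps):
            if state == "halt":
                break
            symbol = cells.get(head, blank)
            state, cells[head], move = transitions[(state, symbol)]
            head += move
        return [cells[i] for i in sorted(cells)]

    # Toy machine: skip right over 1s, write a 1 on the first blank, halt.
    toy = {
        ("start", 1): ("start", 1, +1),
        ("start", 0): ("halt", 1, +1),
    }
    print(run_turing_machine(toy, [1, 1, 1, 0]))   # [1, 1, 1, 1]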

The principle of computational equivalence, as Stephen Wolfram calls it, is a powerful motivation behind the digital approach. If correct, it means that everything can be computed by the same machine, and by an essentially simple machine, thus fulfilling the traditional requirement in physics to find simple underlying laws and mechanisms.

Digital physics is falsifiable: a less powerful class of computers cannot simulate a more powerful class. Therefore, if our universe is being simulated, the simulating computer must be at least as powerful as a Turing machine. If, on the other hand, we find or build a hypercomputer, then our universe cannot be simulated by a Turing machine, and digital physics in this form would be refuted.

The Church-Turing (Deutsch) thesis

The modest version of the Church-Turing thesis claims that any computer as powerful as a Turing machine can calculate anything a human can calculate, given enough time. A stronger version claims that a universal Turing machine can compute anything that any physical device can compute, i.e., that it is not possible to build a hypercomputer, a super-Turing computer. But the limits of practical computation are imposed by physics, not by theoretical computer science:

"Turing did not show that his machines can solve any problem that can be solved "by instructions, explicitly stated rules, or procedures", nor did he prove that the universal Turing machine "can compute any function that any computer, with any architecture, can compute". He proved that his universal machine can compute any function that any Turing machine can compute; and he put forward, and advanced philosophical arguments in support of, the thesis here called Turing's thesis. But a thesis concerning the extent of effective methods -- which is to say, concerning the extent of procedures of a certain sort that a human being unaided by machinery is capable of carrying out -- carries no implication concerning the extent of the procedures that machines are capable of carrying out, even machines acting in accordance with ‘explicitly stated rules’. For among a machine's repertoire of atomic operations there may be those that no human being unaided by machinery can perform." [9]

On the other hand, if two further conjectures are made, along the lines that:

  1. hypercomputation always involves actual infinities
  2. there are no actual infinities in physics

...the resulting compound principle does bring practical computation within Turing's limits.

As David Deutsch expresses it:

I can now state the physical version of the Church-Turing principle: "Every finitely realizable physical system can be perfectly simulated by a universal model computing machine operating by finite means." This formulation is both better defined and more physical than Turing's own way of expressing it.[10] (Emphasis added)

This compound conjecture is sometimes called the strong Church-Turing thesis, or the Church–Turing–Deutsch principle.

Criticism

The critics — including a majority of professionals who work with quantum mechanics — argue against digital physics in a number of ways.

Continuous symmetries

One objection is that the models of digital physics are incompatible with the existence of continuous symmetries such as rotational symmetry, translational symmetry, Lorentz symmetry, electroweak symmetry, and many others. Proponents of digital physics, however, reject the very notion of the continuum, and claim that the existing continuous theories are just approximations of a true discrete theory (the Planck length, for example, as a minimum meaningful unit of distance, suggests that space is at some level quantized).

Locality

Some argue that the models of digital physics violate various postulates of quantum physics. For example, if these models are not based on Hilbert spaces and probabilities, they belong to the class of theories with local hidden variables that some think have been ruled out experimentally using Bell's theorem. This criticism has two possible answers. First, any notion of locality in the 'digital' model need not correspond to locality as formulated in the usual way in the emergent spacetime; a concrete example of this case was recently given by Lee Smolin.[11] Another possibility is a well-known loophole in Bell's theorem known as superdeterminism (sometimes referred to as predeterminism).[12] In a completely deterministic model, the experimenter's decision to measure certain components of the spins is predetermined; the assumption that the experimenter could have decided to measure different components of the spins than he actually did is therefore, strictly speaking, not true.
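
As a brief illustration of the Bell-theorem point (an aside, not part of the digital-physics argument itself): for the spin singlet state, quantum mechanics predicts the correlation E(a, b) = -cos(a - b) between measurements at analyser angles a and b, and at the standard choice of angles the CHSH combination reaches 2√2, exceeding the bound of 2 obeyed by any local hidden-variable model.

    # CHSH check: quantum singlet correlation vs. the local-hidden-variable
    # bound of 2.  The angles are the standard optimal choice.
    import math

    def E(a, b):
        """Quantum-mechanical correlation for the singlet state."""
        return -math.cos(a - b)

    a, a2 = 0.0, math.pi / 2
    b, b2 = math.pi / 4, 3 * math.pi / 4

    S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
    print(abs(S))        # ~2.828, i.e. 2*sqrt(2)
    print(abs(S) > 2)    # True: violates the classical bound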

Real numbers

It can be argued that any physical theory involving real numbers (and all major theories do, at the time of writing) poses problems. Known physics is held to be computable, but that statement needs to be qualified in various ways. A real number, one with an infinite number of digits, is said to be computable if a Turing machine can go on producing its digits endlessly; in other words, there is no question of getting to the "last digit". But this sits uncomfortably with the idea of simulating physics in real time (or any plausible kind of time). Known physical laws (including those of quantum mechanics) are very much infused with real numbers and continua.
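
For instance, a computable real in the sense just described is one whose digit stream can be produced one digit at a time, each in finite time, even though the stream never ends. The sketch below generates decimal digits of √2 with exact integer arithmetic; it is only an illustration of the definition, not of any physical claim.

    # A computable real: emit the decimal digits of sqrt(2) one at a time,
    # forever, using only exact integer arithmetic (Python 3.8+ for isqrt).
    from itertools import islice
    from math import isqrt

    def sqrt2_digits():
        """Yield 1, 4, 1, 4, 2, 1, 3, 5, 6, 2, ... (digits of sqrt(2))."""
        n = 2
        while True:
            yield isqrt(n) % 10   # last digit of floor(sqrt(2) * 10**k)
            n *= 100              # shift two more decimal places under the root

    print(list(islice(sqrt2_digits(), 10)))   # [1, 4, 1, 4, 2, 1, 3, 5, 6, 2]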

"So ordinary computational descriptions do not have a cardinality of states and state space trajectories that is sufficient for them to map onto ordinary mathematical descriptions of natural systems. Thus, from the point of view of strict mathematical description, the thesis that everything is a computing system in this second sense cannot be supported".[13]

Moreover, the universe seems to be able to decide on the values of these quantities on a moment-by-moment basis. As Richard Feynman put it:

"It always bothers me that, according to the laws as we understand them today, it takes a computing machine an infinite number of logical operations to figure out what goes on in no matter how tiny a region of space, and no matter how tiny a region of time. How can all that be going on in that tiny space? Why should it take an infinite amount of logic to figure out what one tiny piece of space/time is going to do?[14]

However, he went on to say:

"So I have often made the hypothesis that ultimately physics will not require a mathematical statement, that in the end the machinery will be revealed, and the laws will turn out to be simple, like the chequer board with all its apparent complexities. But this speculation is of the same nature as those other people make – ‘I like it’, ‘I don’t like it’ – and it is not good to be prejudiced about these things."[14]

Computation and mechanism

It can also be argued that only certain fairly specific systems are computers, so the universe as a whole cannot be a computer. For instance, Gualtiero Piccinini (who introduced the term 'pancomputationalism' in his Ph.D. dissertation) argues[13] that, of the various ways of defining a computer, the ones that are sufficiently rich and specific to make the computational theory of mind a substantive theory are too specific to apply to any system whatsoever.

Continuous alternatives

In light of the above criticisms, an alternative is to ask whether, in continuous automata such as an Einstein vacuum spacetime, phenomena analogous to gliders and glider guns exist. It has been shown [citation needed] that the timelike topological feature associated with any closed timelike curve (CTC) propagates in a manner similar to a glider. A glider gun, however, requires topological change, which under certain assumptions implies the creation of a singularity by a theorem of Tipler; this theorem does not apply to spacetimes with a CTC through every point.

See also

References

  1. ^ Fredkin, Edward, "Digital Mechanics", Physica D, (1990) 254-270 North-Holland.
  2. ^ Wolfram's New Kind of Science web site
  3. ^ Reviews of Wolfram's A New Kind of Science
  4. ^ Schmidhuber, J. Computer Universes and an Algorithmic Theory of Everything
  5. ^ G. 't Hooft, Quantum Gravity as a Dissipative Deterministic System, Class. Quant. Grav. 16, 3263-3279 (1999) preprint.
  6. ^ S. Lloyd, The Computational Universe: Quantum gravity from quantum computation, preprint.
  7. ^ Zizzi, P., "Spacetime at the Planck Scale: The Quantum Computer View", arXiv:gr-qc/0304032
  8. ^ Chalmers, D. Facing up to the Hard Problem of Consciousness referring to Wheeler, J.A. 1990. Information, physics, quantum: The search for links. In (W. Zurek, ed.) Complexity, Entropy, and the Physics of Information. Redwood City, CA: Addison-Wesley.
  9. ^ Stanford Encyclopedia of Philosophy on the Church-Turing thesis
  10. ^ Deutsch, D. ‘Quantum Theory, the Church-Turing Principle and the Universal Quantum Computer’
  11. ^ L. Smolin, Matrix models as non-local hidden variables theories, preprint.
  12. ^ J. S. Bell, Bertlmann's socks and the nature of reality, Journal de Physique 42, C2 41-61 (1981).
  13. ^ a b Piccinini, Gualtiero (2007). "Computational Modelling vs. Computational Explanation: Is Everything a Turing Machine, and Does It Matter to the Philosophy of Mind?". Australasian Journal of Philosophy. 85 (1): 93–115.
  14. ^ a b Feynman, R. "The Character of Physical Law" p. 57.