Matt: I would prefer to analyse something simple such as the double-slit experiment. If you do an experiment to see which slit each photon goes through, you get an accumulation of photons in equal numbers behind each slit. If you make no effort to see which slit the photons go through, you get an interference pattern. If this is all a simulation, what requires the simulation to behave this way? I assume this is a forced result of the assumption that only as much computation is used as is needed to run the simulation. A radioactive atom decays when it decays; all we can say with any certainty is its probability distribution in time for decay. Why is that? Why would a simulation not maintain local causality (EPR paradox)? I think it would be far more interesting (and meaningful) if the simulation hypothesis could provide a basis for these observations.
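[To make the decay point concrete, here is a minimal sketch in Python, not from the original posts: individual decay times, sampled from an exponential distribution with a hypothetical half-life chosen only for illustration, are individually unpredictable, yet the empirical survival fraction tracks the predicted exp(-lambda*t).]

```python
# Minimal sketch (illustrative only): individual radioactive decay times are
# random, but the survival curve N(t)/N0 follows exp(-lambda*t).
# The half-life below is a hypothetical value chosen for illustration.
import math
import random

half_life = 10.0                       # arbitrary time units
lam = math.log(2) / half_life          # decay constant lambda
n_atoms = 100_000

# Sample one decay time per atom: t = -ln(U)/lambda for uniform U in (0, 1].
decay_times = [-math.log(1.0 - random.random()) / lam for _ in range(n_atoms)]

print("First five decay times (individually unpredictable):",
      [round(t, 2) for t in decay_times[:5]])

# The distribution, by contrast, is fully predictable.
for t in (5.0, 10.0, 20.0):
    surviving = sum(1 for d in decay_times if d > t) / n_atoms
    print(f"t={t:5.1f}  measured survival={surviving:.3f}  "
          f"predicted exp(-lambda*t)={math.exp(-lam * t):.3f}")
```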
Eric B. Ramsay

Matt Mahoney <[EMAIL PROTECTED]> wrote:

--- "Eric B. Ramsay" wrote:
> Apart from all this philosophy (never-ending as it seems), Table 1 of the paper referred to at the start of this thread gives several consequences of a simulation that offer to explain what's behind current physical observations, such as the upper speed limit of light, relativistic and quantum effects, etc. Without worrying about whether we are a simulation of a simulation of a simulation, etc., it would be interesting to work out all the qualitative/quantitative (?) implications of the idea and see if observations strongly or weakly support it. If the only thing we can do with the idea is discuss philosophy, then the idea is useless.

There is plenty of physical evidence that the universe is simulated by a finite state machine or a Turing machine.

1. The universe has finite size, mass, age, and resolution. Taken together, the universe has a finite state, expressible in approximately c^5 T^2 / (hG) = 1.55 x 10^122 bits ~ 2^406 bits, where h is Planck's constant, G is the gravitational constant, c is the speed of light, and T is the age of the universe. By coincidence, if the universe is divided into 2^406 regions, each is about the size of a proton or neutron. This is a coincidence because h, G, c, and T don't depend on the properties of any particles. (A rough numerical check of these figures is sketched after this message.)

2. A finite state machine cannot model itself deterministically. This is consistent with the probabilistic nature of quantum mechanics.

3. The observation that Occam's Razor works in practice is consistent with the AIXI model of a computable environment.

4. The complexity of the universe is consistent with the simplest possible algorithm: enumerate all Turing machines until a universe supporting intelligent life is found. The fastest way to execute this algorithm is to run each of the 2^n universes of complexity n bits for 2^n steps. The complexity of the free parameters in many string theories plus general relativity is a few hundred bits (maybe 406). (A toy sketch of this enumeration also follows below.)

-- Matt Mahoney, [EMAIL PROTECTED]
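[A back-of-the-envelope check of point 1, in Python, not from the original post: it evaluates c^5 T^2/(hG) with standard constants and T ≈ 13.8 billion years, and divides a Hubble-radius sphere into 2^406 cells. O(1) geometric factors, and the choice of h versus h-bar, move the bit count between roughly 10^121 and 10^122, i.e. an exponent of about 400 bits, consistent with the 1.55 x 10^122 ~ 2^406 quoted above.]

```python
# Order-of-magnitude check of the figures in point 1.
import math

h = 6.626e-34         # Planck's constant, J*s
G = 6.674e-11         # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8           # speed of light, m/s
T = 13.8e9 * 3.156e7  # age of the universe, s

# Dimensionless bit count; equals (T / Planck time)^2 since t_P^2 = hG/c^5.
bits = c**5 * T**2 / (h * G)
print(f"c^5 T^2 / (hG) ~ {bits:.2e} bits ~ 2^{math.log2(bits):.0f}")

# The "proton-size" coincidence: split a Hubble-radius sphere into 2^406 cells.
R = c * T                                 # crude universe radius, m
volume = 4.0 / 3.0 * math.pi * R**3
cell = (volume / 2**406) ** (1.0 / 3.0)   # linear size of one cell
print(f"cell size ~ {cell:.1e} m (proton diameter ~ 1.7e-15 m)")
```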
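[Point 4 describes what amounts to a Levin-style dovetailed search: run every n-bit program for 2^n steps, increasing n until a suitable universe turns up. Here is a toy Python sketch of that schedule only; run_universe and supports_intelligent_life are placeholders invented for illustration, not anything from the post, and a real "intelligent life" test is of course not available.]

```python
# Toy sketch of the enumeration in point 4: run every n-bit "program" for
# 2^n steps, increasing n until a universe supporting intelligent life is
# found.  Both helper functions are hypothetical stand-ins.
from itertools import product

def run_universe(program: str, steps: int):
    """Placeholder: 'simulate' the universe encoded by a bit string."""
    return {"program": program, "steps": steps}   # stand-in for a real state

def supports_intelligent_life(state) -> bool:
    """Placeholder predicate; always False in this stub."""
    return False

def enumerate_universes(max_bits: int):
    for n in range(1, max_bits + 1):
        for bits in product("01", repeat=n):       # all 2^n programs of length n
            program = "".join(bits)
            state = run_universe(program, steps=2 ** n)
            if supports_intelligent_life(state):
                return program
    return None

print(enumerate_universes(max_bits=10))            # None with the stub predicate
```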
