On 18/02/2008, Richard Loosemore <[EMAIL PROTECTED]> wrote:

> The last statement you make, though, is not quite correct:  with a
> jumbled up sequence of "episodes" during which the various machines were
> running the brain code, the whole would lose its coherence, because input
> from the world would now be randomised.
>
> If the computer were being fed input from a virtual reality simulation,
> that would be fine.  It would sense a sudden change from real world to
> virtual world.

The argument that is the subject of this thread wouldn't work if the
brain simulation had to interact with the world at the level of the
substrate it is being simulated on. However, it does work if you
consider an inputless virtual environment with conscious inhabitants.
Suppose you are now living in such a simulation. From your point of
view, today is Monday and yesterday was Sunday. Do you have any
evidence to support the belief that Sunday was actually run
yesterday in the real world, or that it was run at all? The simulation
could have been started up one second ago, complete with false
memories of Sunday. Sunday may not actually be run until next year,
and the version of you then will have no idea that the future has
already happened.
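
To make this concrete, here is a toy Python sketch (purely my own
illustration; the update rule, episode lengths and names are arbitrary).
The point is that a closed, deterministic simulation can have its
episodes computed in any real-world order, provided each one starts
from the stored end-state of its predecessor, and nothing inside the
simulation records which order the outside world chose.

import random

def step(state: int) -> int:
    """One deterministic tick of the toy 'world' (stand-in for the brain code)."""
    return (state * 6364136223846793005 + 1442695040888963407) % 2**64

def run_episode(start_state: int, ticks: int) -> int:
    """Run one self-contained episode, e.g. one simulated day."""
    s = start_state
    for _ in range(ticks):
        s = step(s)
    return s

# Sequential history: Sunday, Monday, Tuesday, ... with a checkpoint per day.
seed, ticks, days = 42, 1000, 5
checkpoints = [seed]
for _ in range(days):
    checkpoints.append(run_episode(checkpoints[-1], ticks))

# Recompute the same days in a shuffled real-world order, each episode
# starting from its stored checkpoint.
order = list(range(days))
random.shuffle(order)
recomputed = {i: run_episode(checkpoints[i], ticks) for i in order}

# The internal history is identical either way: from the inside there is
# no fact about when (or whether) "Sunday" was actually run.
assert all(recomputed[i] == checkpoints[i + 1] for i in range(days))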

> But again, none of this touches upon Lanier's attempt to draw a bogus
> conclusion from his thought experiment.
>
>
> > No external observer would ever be able to keep track of such a
> > fragmented computation and as far as the rest of the universe is
> > concerned there may as well be no computation.
>
> This makes little sense, surely.  You mean that we would not be able to
> interact with it?  Of course not:  the poor thing will have been
> isolated from meaningful contact with the world because of the jumbled up
> implementation that you posit.  Again, though, I see no relevant
> conclusion emerging from this.
>
> I cannot make any sense of your statement that "as far as the rest of
> the universe is concerned there may as well be no computation."  So we
> cannot communicate with it anymore.... that should not be surprising,
> given your assumptions.

We can't communicate with it, so it is useless for what we normally
think of as computation. A rainstorm contains patterns
isomorphic with an abacus adding 127 and 498 to give 625, but to
extract this meaning you have to already know the question and the
answer, using another computer such as your brain. However, in the
case of an inputless simulation with conscious inhabitants this
objection is irrelevant, since the meaning is created by observers
intrinsic to the computation.
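
A toy Python sketch of that interpretation problem (again my own
illustration, not anything from Lanier): any string of "rainstorm"
states can be read as the steps of the addition, but only via a mapping
that was built using the answer, so the computational work is done by
the interpreter rather than by the rain.

import random

# Arbitrary physical states: say, raindrop counts in successive seconds.
rainstorm = random.sample(range(10**6), 4)

# The computation the rainstorm supposedly implements: 127 + 498 = 625.
abacus_steps = [("load", 127), ("load", 498), ("add", None), ("result", 625)]

# An "interpretation" is just a lookup table from raw states to abacus steps.
# Note that we could not have built this table without already knowing 625.
interpretation = dict(zip(rainstorm, abacus_steps))

# "Extracting" the result from the rainstorm only reads our own table back.
decoded = [interpretation[s] for s in rainstorm]
assert decoded[-1] == ("result", 625)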

Thus if there is any way a physical system could be interpreted as
implementing a conscious computation, it is implementing the conscious
computation, even if no-one else is around to keep track of it.



-- 
Stathis Papaioannou
