How do you get the "50% chance"? There is a 100%
chance of a mind waking up who has been uploaded, and
also a 100% chance of a mind waking up who hasn't.
This doesn't violate the laws of probability because
these aren't mutually exclusive. Asking which one "was
you" is silly, because we're assuming they're
completely identical at the instant they wake up;
they're both you. This is apparently something that
needs a lot of explaining; consciousness is not a
conserved quantity. If there were some magicky
consciousness "stuff" that either was uploaded or not
uploaded, then sure, you could talk about a 50%
probability of the upload being successful. But if we
define "successful" as "you wake up uploaded", and
"failure" as "you wake up not uploaded", then there is
(assuming the process always works technically) a 100%
probability of success and a 100% probability of
failure. Both possibilities refer to the *same
process*, to the exact same series of atoms getting
juggled back and forth. I recognize that this is
really confusing, but it seems to match what would
happen if you actually tried it.
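
To make that bookkeeping concrete, here's a toy Python
sketch (the event labels and function name are mine; it
assumes, as above, that the process always works
technically):

def run_upload():
    # One run of the copy-upload process: the original
    # body wakes up, and an identical uploaded mind
    # also wakes up. Not mutually exclusive outcomes.
    return {"woke_up_uploaded", "woke_up_not_uploaded"}

trials = 10000
uploaded = sum("woke_up_uploaded" in run_upload()
               for _ in range(trials))
not_uploaded = sum("woke_up_not_uploaded" in run_upload()
                   for _ in range(trials))

print(uploaded / trials)      # 1.0 -- "success" every time
print(not_uploaded / trials)  # 1.0 -- "failure" every time

The two frequencies sum to 2.0, which is fine: since the
outcomes aren't mutually exclusive, no probability axiom
is violated.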

Now, if you consider the question from the standpoint
of utility, it becomes even more confusing because
desirability isn't a conserved quantity. Suppose, for
the sake of argument, that each sentient being is
given a hundred chips, which they can put toward
making the outcome of any event more or less probable.
Chips are conserved for any one person, and any chip
is exchangeable for any other chip, but *effects*
aren't conserved; a single chip will affect some
events a lot more than others.

Suppose that a guy, Fred, wants to get uploaded. He
puts 50 chips towards getting uploaded, since
uploading is really nifty and can be used for all
kinds of things: chips are worth a lot more in VR,
where events are so much easier to manipulate. The
FAI looks him over, scans his
brain-state, extrapolates his volition, concludes that
the vast majority of the possible future Freds are
happy, that his desire is genuine, etc., and gives the
tech-robot the OK to proceed. Fred is sedated, the
brain-state is uploaded, and a new VR being with
Fred's memories is formed. When this being, Fred*,
wakes up, he is recognized as sentient and is *also*
given 100 chips, initially allocated according to
Fred's preferences. Fred* promptly deallocates the 50
chips, as that goal has been fulfilled, and starts
allocating them towards whatever he thinks is fun in
VR. But when Fred wakes
up, his goal is still unfulfilled, and so he quite
stubbornly keeps his 50 chips allocated on "upload",
claiming that he didn't get what the FAI promised.
Fred*, seeing Fred over the monitor, feels sorry for
him and also allocates 10 chips towards getting Fred
uploaded.

The normal thing after a goal has been fulfilled is
for the chips to be deallocated; they are valuable,
after all, and people need to move on to
other things. This prevents people from making the FAI
bang its resources eternally against something simple
like wanting a banana long after the original desire
was fulfilled. (Note that simple banana-tiling
machines are special cases; I'm talking about ordinary
sentient beings who just happen to want a single
banana.) But in this instance, the number of chips
allocated to the goal is even greater than it was
before, and so the FAI is programmed to put even more
resources behind getting Fred uploaded. By induction,
the original Fred will never be uploaded, even though
the pressure to get him uploaded keeps increasing with
each successive generation of uploaded Freds. This is
something of a paradoxical situation akin to quark
confinement: the more you try to fulfill the goal of
"uploading Fred", the more desirable fulfilling it
becomes, because the copy-upload process actually
*creates new goals* as part of its mechanism.
This kind of thing, both here and if it turns up
anywhere else, is Very Bad for any future society with
finite resources, because all those resources will
wind up getting sucked into the positive-feedback
cycle.
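
Here's a toy Python sketch of that feedback loop (the
50- and 10-chip figures come from the example above;
the loop structure and names are just my illustration):

ORIGINAL_STAKE = 50   # Fred's chips on "upload Fred"
SYMPATHY_STAKE = 10   # chips each new Fred* adds for him

def pressure_after(generations):
    # Total chips allocated to "upload Fred" after the
    # FAI has run the copy-upload process that many
    # times. The original never deallocates (his goal
    # is never fulfilled from his side), and every new
    # Fred* chips in out of sympathy.
    total = ORIGINAL_STAKE
    for _ in range(generations):
        total += SYMPATHY_STAKE
    return total

for g in range(5):
    print(g, pressure_after(g))   # 50, 60, 70, 80, 90

The allocation grows without bound, so a chip-weighted
FAI keeps pouring more and more resources into a goal
it can never satisfy.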

 - Tom

--- Randall Randall <[EMAIL PROTECTED]>
wrote:

> 
> On Jun 28, 2007, at 12:56 PM, Charles D Hixson
> wrote:
> 
> > Stathis Papaioannou wrote:
> >> Yes, you would live on in one of the copies as if
> uploaded, and yes
> >> the selection of which copy would be purely
> random, dependent on the
> >> relative frequency of each copy (you can still
> define a measure to
> >> derive probabilities even though we are talking
> infinite subsets of
> >> infinite sets). What do you think would happen?
> > Why in only one of the copies? This is the part of
> the argument  
> > that I don't understand.  I accept that over time
> the copies would  
> > diverge, but originally they would be
> substantially the same, so  
> > why claim that the original consciousness would
> only be present in  
> > one of them?
> >
> 
> If you ask either copy afterward whether they ended
> up experiencing both existences or just one, they'll
> say just one.  Since there's two possibilities,
> and since no single individual will experience
> both at once, there's a 50% chance.
> 
> Someone on the outside could well insist that
> the original person is experiencing both at once
> (and many people do insist that that will be the
> case), but you won't be able to find a person to
> talk to who is experiencing both at once at any
> given time.
> 
> --
> Randall Randall <[EMAIL PROTECTED]>
> "Someone needs to invent a Bayesball bat that exists
> solely for
>   smacking people [...] upside the head." --
> Psy-Kosh on reddit.com
> 
