On 09/09/07, Matt Mahoney <[EMAIL PROTECTED]> wrote:

> Your dilemma: after you upload, does the original human then become a
> p-zombie, or are there two copies of your consciousness?  Is it necessary to
> kill the human body for your consciousness to transfer?

I have the same problem in ordinary life, since the matter in my brain
from a year ago has almost all dispersed into the biosphere. Even the
configuration of matter in my current brain, and the information it
represents, only approximates that of my erstwhile self. It's just
convenient that my past selves naturally disintegrate, so that I don't
encounter them and fight it out to see which is the "real" me. We've
all been through the equivalent of destructive uploading.

> What if the copy is not exact, but close enough to fool others who know you?
> Maybe you won't have a choice.  Suppose you die before we have developed the
> technology to scan neurons, so family members customize an AGI in your
> likeness based on all of your writing, photos, and interviews with people that
> knew you.  All it takes is 10^9 bits of information about you to pass a Turing
> test.  As we move into the age of surveillance, this will get easier to do.  I
> bet Yahoo knows an awful lot about me from the thousands of emails I have sent
> through their servers.

There is no guarantee that something which behaves the same way as the
original also has the same consciousness. However, there are good
arguments for the thesis that something which behaves the same way as
the original, as a result of identical or isomorphic brain structure,
also has the same consciousness as the original.

("Same" in this context does not mean "one and the same", any more
than I am one and the same as my past selves.)


-- 
Stathis Papaioannou
