On 06 Oct 2015, at 00:54, Brent Meeker wrote:



On 10/5/2015 3:05 PM, Bruce Kellett wrote:
On 6/10/2015 12:29 am, Stathis Papaioannou wrote:
On 5 October 2015 at 22:10, Bruce Kellett <bhkell...@optusnet.com.au> wrote:
On 5/10/2015 7:46 pm, Stathis Papaioannou wrote:
On 5 October 2015 at 14:09, Bruce Kellett <bhkell...@optusnet.com.au > wrote:
On 5/10/2015 1:53 pm, Stathis Papaioannou wrote:
On 5 Oct 2015, at 12:11 PM, Bruce Kellett <bhkell...@optusnet.com.au > wrote:

So psychological continuation is very dependent on the exact details of the case, and if copying of consciousness ever becomes possible, then, by and large, it will simply be regarded as another way of creating new people -- it will not be a recipe for immortality in any except a very impoverished sense.

The example I was thinking of was destructive copying. This is equivalent to being knocked out and carried to another place. It doesn't matter how far or if there is a time delay, since you don't experience this. It also doesn't matter if there is a causal connection, as there would be in teleportation or being knocked out and carried, or if the copying occurs randomly. There is no way for you to know from introspection how you have come to wake up in a new place.
You might not then be able to tell by introspection whether you had been destructively copied to a new location or simply carried there while unconscious.

That is exactly the point I have been trying to make. Further, you can't tell from introspection if you have been copied through exhaustive enumeration of all possible brain states 10^100 metres away or simply carried to the next room while unconscious.
I disagree. You certainly can tell that you have not been created by chance 10^100 metres away because that would involve the transfer of information over that distance, and that is not feasible in the time spans we are considering.

We have gone over this before: no transfer of information is needed if the copy is made by trying every possible configuration of atoms. Transfer of information is needed to verify that there is a copy and an original, but this makes no difference to the experience of the copy.
I think there is more involved than this. It is not a matter of experience. If a purported copy of you arises at some remote time or place by random chance, then there is no connection with your physical existence or consciousness. A clearer situation arises if you argue that there are an infinite number of copies of you in the infinite type I multiverse. Each of these 'copies' is a complete conscious person with his/her own personal history. If you, here, die suddenly, then in so far as the copies are identical, they will also die. And if they are only approximately equivalent (so that they don't die), then they are not going to take too kindly to your turning up suddenly and trying to usurp their brain, body and consciousness!

These duplication scenarios make sense only if the remote person is connected with you by a transfer of information -- such as by sending your complete brain/body scan to the remote location and then reassembling the copy there. That might resurrect you, or make a viable copy, but no chance configuration, or already existing remote person, is able to fulfill the scenario you wish to paint.

That is why one needs independent external evidence to be sure about what is going on.

To be sure what is going on, yes, independent external evidence is needed. But it is not needed in order to be conscious.
But it is needed in order to be sure of what person you are. Self-deception is the most common form of deception, and only a fool would rely solely on introspection for any important question.

Information about any other copies is interesting but it has no bearing on your sense of continuity of consciousness.
If personal copying is essentially unavailable in your experience, then you might believe that you had simply been carried somewhere while unconscious. If personal duplication were commonplace in your experience, you would require more evidence to tell what might have happened. In either case, introspection is a poor guide to the nature of reality.

But introspection is an excellent guide to your thoughts and feelings.

But I am more than my thoughts and feelings. They are a poor guide to my identity. They change far too rapidly and chaotically to be a reliable guide to anything.

Your identity is only your thoughts and feelings. Your brain, body and the environment are only relevant because they affect your thoughts and feelings. If you were uploaded to a simulation that preserved your thoughts and feelings, you would not notice that anything unusual had happened, even though the substrate in which your mind resides had changed radically.
This, of course, is the heart of our disagreement. Your identity is a lot more than your thoughts and feelings because those thoughts and feelings only have meaning in a context. And it is your physical body and immediate surrounds that provide that context. You might be right about uploading if the uploading is into an environment that is not too dissimilar from the current context of your body. However, if your sensory inputs change to any marked extent, you will certainly be aware that something has happened. If the changes are too drastic and inexplicable, you will quite probably go mad.

I agree that the hardware of your brain, per se, is not important. But only if one form of hardware simulates the other, essentially exactly. And if the wider environment is largely reproduced. Consciousness supervenes on the physical brain, and replacing the hardware does not alter this fact -- your consciousness still supervenes on the physical substrate. It is not independent of it as you wish to maintain.

It's not clear to me who is arguing for what. Stathis may think that consciousness is independent of its physical substrate, but I don't see that he's arguing that here. He's arguing that there can be more than one instance of "the same" consciousness.

Yes, like for example when you are duplicated at W and M, but are still in the box, before opening it. There are two instantiations of the same consciousness. From the 1p view, the person can consistently be said to be unique, at both places at once. Once she opens the door, she gets the bit of information which differentiates her from her doppelganger.

The same can be said if, by chance or by law, an exact copy of yourself appears outside your light cone. From the first-person experience you will differentiate. This does not violate relativity, any more than making you unconscious, or making your brain work much more slowly, could make you feel you are travelling faster than light; that is only a first-person "illusion". Likewise with a sudden reconstitution of you very far away, and actually that is the case with all your reconstitutions in the sigma_1 arithmetical reality (which explains the problem of matter confronting computationalists).


But it's not clear what is meant by "the same". Does one think of one's own consciousness as being the same as it was a second ago?

No, the same means exactly the same, like if two computers run the same (relevant) program.

You have a sort of choice whether or not to double the consciousness. It is unique from the 1p view, but it can be, and necessarily is, incarnated in many different computations, executed by different universal machines/numbers.
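
To make the "same program" analogy concrete, here is a minimal, purely illustrative Python sketch (the toy step function and the W/M labels are my own invention, not part of any formal argument): two independent runs of the same deterministic program, started in the same state and fed the same inputs, pass through exactly the same states; they only diverge once their inputs differ, as when the door is opened in W or in M. From inside either run there is nothing to distinguish which copy one is until the differentiating bit arrives.

# Purely illustrative: two "machines" running the same deterministic program
# remain in identical states until their inputs differ.

def step(state, bit):
    # Toy deterministic transition -- the "program" both machines run.
    return (state * 2 + bit) % 1000

def run(initial_state, inputs):
    state = initial_state
    history = [state]
    for bit in inputs:
        state = step(state, bit)
        history.append(state)
    return history

# Same program, same initial state, same inputs: the histories are identical.
assert run(1, [0, 1, 1]) == run(1, [0, 1, 1])

# Differentiation happens only when the inputs differ
# (opening the door in W versus in M).
print(run(1, [0, 1, 1, 0]))  # the W branch
print(run(1, [0, 1, 1, 1]))  # the M branch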



an hour? a year? twenty years? I think there must be degrees of "sameness".

My consciousness of one second ago already belongs to a myth; it is a memory. But the sameness bears more on the notion of person. I am the one who collects my memories and interprets them in terms of possible futures or accessible states. "Who am I?" is too personal a question; it is almost impolite to ask it in the comp frame.




Similarly, the degree will depend on the environmental context and interaction. If you became completely immobilized I think it would change your consciousness. Stephen Hawking is quite different from what he was 50 years ago. If you had a chip implanted that allowed you to perceive the whole EM spectrum, including polarization, it might well change your consciousness. Drugs and accidents change people's personalities and so, by inference, their consciousness. So does just plain learning.

I agree with what you say.






Is the question really about "Can we achieve immortality by copying to different substrates?" As Bruce points out, we would only preserve our "self" exactly up to the last copy event, since we would have diverged from there. It's like making a backup on your computer: it doesn't mean that nothing is lost when it crashes.

OK.
That does not mean that something is lost, either.

The quest for immortality is a waste of time, because once immortal, there is no more fun than to be mortal again.

It is vain because, from computationalism "well understood", we are already immortal, although not in a believable or justifiable way. (So I am close to a G* minus G blasphemy here ...)

Bruno

http://iridia.ulb.ac.be/~marchal/


