On 2 February 2010 18:07, Jack Mallah <jackmal...@yahoo.com> wrote:

> --- On Wed, 1/27/10, Stathis Papaioannou <stath...@gmail.com> wrote:
>> if there were a million copies of me in lockstep and all but one were 
>> destroyed, then each of the million copies would feel that they had 
>> continuity of consciousness with the remaining one, so they are OK with what 
>> is about to happen.
>
> Suppose someone killed all copies but lied to them first, saying that they 
> would survive.  They would not feel worried.  Would that be OK?  It seems 
> like the same idea to me.

You're assuming that it is false to say each of the million copies
would survive if all but one were destroyed. You seem to think that it
is OK if the (destructive) copying is one->one  but not if it is
many->one. In coming to this conclusion it seems you are taking the
general moral position that it is good to increase the number of
people in the world and bad to decrease it. However, for the purposes
of this discussion what is at issue is the *purely selfish*
perspective, not what is good for other people or the rest of the
world. From the purely selfish perspective, it does seem to you and
to me and to every other person that we survive if a future copy of us
exists, even though arguably this is just a delusion. And the delusion
of survival is perfectly maintained for each of the multiple copies
destroyed in a many->one replication, as perfectly as it is maintained
for a single copy destroyed in a one->many replication.

Here is a scenario to consider:

(1) A1 is destructively scanned, and from the scan a copy B1 is made.
A1 looks forward to surviving as B1, and B1 considers that he is the
same person as A1 and has survived. In a sense these beliefs may be
delusional, but A1 is happy and B1 is happy. After all, this is what
happens in ordinary life.

(2) As in (1), but in the next room there is another copy A2 in
lockstep with A1. A2 is destroyed when A1 is destroyed in the
destructive scanning. Now, does the presence of A2 diminish A1's
expectation that he will survive in B1, making the delusion somehow
less compelling or more delusional? I don't see by what process that
could possibly happen. A1 still feels he will survive in B1 so A1 is
happy; and if he learns at some point about the fate of A2, at worst
he will feel sorry for A2.

(3) I think you can see that the situation is exactly symmetrical for
A2. A2 feels he will survive as B1, and the presence of A1 does not
detract from this feeling at all. At worst, if he learns about A1 he
will feel sorry for him (although if he follows this thought
experiment he will realise that A1 will survive as well). And B1, of
course, feels he is the survivor of A1/A2; it is meaningless to say he
survives as the continuation of one rather than the other, since the
scanned subjective content is the same in each case.

The result is that A1, A2 and B1 are all happy. No-one feels he has
lost anything from a selfish perspective, even though from an external
perspective, for better or worse, there is one less consciousness in
the world. And calling this delusional does not invalidate it, since
it is exactly the sort of delusion we all have and attempt at all
costs to preserve in ordinary life.

>> Your measure-preserving criterion for determining when it's OK to kill a 
>> person is just something you have made up because you think it sounds 
>> reasonable, and has nothing to do with the wishes and feelings of the person 
>> getting killed.
>
> First, I should reiterate something I have already said: It is not generally 
> OK to kill someone without their permission even if you replace them.  The 
> reason it's not OK is just that it's like enslaving someone - you are forcing 
> things on them.  This has nothing particularly to do with killing; the same 
> would apply, for example, to cutting off someone's arm and replacing it with 
> a new one.  Even if the new one works fine, the guy has a right to be mad if 
> his permission was not asked for this.  That is an ethical issue.  I would 
> make an exception for a criminal or bad guy who I would want to imprison or 
> kill without his permission.

That's a fair enough point: we should not destructively copy people
who don't want to go through the procedure, just as we shouldn't
photograph people who believe the photograph will steal their soul,
since doing so would leave them upset. But the point I am trying to
make is that destructive copying with reduction in measure is *not*
killing anyone, any more than going to sleep and waking up the next
day kills anyone.

> That said, as my example of lying to the person shows, Stathis, your 
> criterion of caring about whether the person to be killed 'feels worried' is 
> irrelevant to the topic at hand.
>
> Measure preservation means that you are leaving behind the same number of 
> people you started with.  There is nothing arbitrary about that.  If, even 
> having obtained Bob's permission, you kill Bob, I'd say you deserve to be 
> punished if I think Bob had value.  But if you also replace him with Charlie, 
> then if I judge that Bob and Charlie are of equal value, I'd say you deserve 
> to be punished and rewarded by the same amount.  The same goes if you kill 
> Bob and Dave and replace them with Bob' and Dave', or if you kill 2 Bobs and 
> replace them with 2 other Bobs.  That is measure preservation.  If you kill 2 
> Bobs and replace them with only one then you deserve a net punishment.

That seems completely arbitrary. Bob might worry about his measure
being reduced, but Charlie might get upset if his measure is
increased, for then there would be two Charlies with equal claim to
the original Charlie's possessions. Perhaps non-consensual one->many
copying will in future be considered a greater crime than many->one
copying, with one->none or many->none "copying" being a necessary
condition for a crime to be labelled murder.

>> > Suppose there is a guy who is kind of a crazy oriental monk.  He meditates 
>> > and subjectively believes that he is now the reincarnation of ALL other 
>> > people.  Is it OK to now kill all other people and just leave alive this 
>> > one monk?
>>
>> No, because the people who are killed won't feel that they have continuity 
>> of consciousness with the monk, unless the monk really did run emulations of 
>> all of them in his mind.
>
> They don't know what's in his mind either way, so what they believe before 
> being killed is utterly irrelevant here.  We can suppose for argument's sake 
> that they are all good peasants, they never miss giving their rice offerings, 
> and so they believe anything the monk tells them.  And he believes what he 
> says.
>
> Perhaps what you were trying to get at is that _after_they are killed, it 
> will be OK if they really do find themselves reincarnated in the monk.  But 
> who decides if that occurred or not?  The monk thinks it did; that criterion 
> would make his belief self-consistent.  Nor can you require the number of 
> people to be conserved - we know fission (as when learning the result of a QM 
> experiment) and fusion would be possible.  Nor can you use the criterion of 
> memory, unless you are prepared to say that memory loss changes one person 
> into a different person.  If so you will die when you forget where you put 
> your car keys.
>
> The reality is, there is no non-arbitrary criterion for personal identity 
> over time.
>
> Personal identity, being a matter of arbitrary definition, can have no 
> relevance to what is observable.   What matters is the measure distribution.

I agree that there is no non-arbitrary criterion for preservation of
identity over time. However, I do feel that I am the same person that
I was yesterday, even if I have insight into this being delusional.
It's a delusion I would like to continue in exactly the same way that
it has continued my whole life, and it wouldn't continue in that way
with a meditating monk unless he really did run an emulation of me,
which is something that could be empirically tested.

>> The fact of the matter is that we are *not* the same physical being 
>> throughout our lives. The matter in our bodies, including our brains, turns 
>> over as a result of metabolic processes so that after a few months our 
>> physical makeup is almost completely different. It is just a contingent fact 
>> of our psychology that we consider ourselves to be the same person 
>> persisting through time. You could call it a delusion. I recognise that it 
>> is a delusion, but because my evolutionary program is so strong I can't 
>> shake it, nor do I want to. I want the delusion to continue in the same way 
>> it always has: the new version of me remembers being the old version of me 
>> and anticipates becoming the even newer version of me.
>
> If you really acknowledge that it is a delusion, that is good progress.  But 
> you are wrong that you can't shake it.  In fact, if you admit it is a 
> delusion then you admit that it is false and you have already shaken it.

I have insight into it, so technically it is an overvalued idea rather
than a delusion. Still, I don't wish to rid myself of it.

> Of course, my utility function remains strongly peaked in favor of people 
> very similar to my current self so that in practical terms I behave normally.

I think that's what I mean when I say that I want the
delusion/overvalued idea to continue.

>> And it wouldn't matter to me if more copies of me were destroyed than 
>> reconstituted or allowed to live, since each of the copies would continue to 
>> have the delusional belief that his consciousness will continue in the sole 
>> survivor.
>
> That is the heart of the matter.  Such a delusion is both false and 
> dangerous.  My task is to convince you, and others like you, that while your 
> current consciousness will not continue per se no matter what, the 
> consciousness of your future selves has value.  And the more of them there 
> are (in terms of measure) the more value, because they do not share a single 
> consciousness even if they are all of the same type.

I more or less agree with your statement of the objective facts, but
you can't sell me your view that there is value in the sheer number
of future copies of me. If pressed, the only value I can see in
multiple copies is as backup, although even that becomes irrelevant
in an infinite multiverse. I am quite sure that the view I have
expressed is the "natural" view given the way our psychology has
evolved. For example, given the choice of (a) doubling your measure
and halving your income or (b) halving your measure and doubling your
income, I'm sure that the vast majority of people would choose (b),
because, delusional or not, it will seem to them that the only
difference between (a) and (b) is their income level.

> This is a key point so I'll try to illustrate it with a diagram:
>
> 1 ---------------------------------------------------
>
> 2 ---------------------------------------------------
>
> Say this represents 2 copies of Bob, with forward time being to the right.  
> Each "-" is an observer-moment (OM).  The copies remain identical until they 
> suddenly terminate, receiving the same kind of inputs and so on.  They evolve 
> independently though; perhaps they are many light-years apart.
>
> Now, suppose we have the option of killing copy #2 about halfway along Bob's 
> natural life.  If we do the diagram will now look like this:
>
> 1 -----------------------------------------
>
> 2 --------------------
>
> Copy #1 remains unchanged.  If we believe in a reasonable model for 
> consciousness, that is because local conditions give rise to consciousness in 
> a locally determined amount.  Clearly, there is less of Bob in this case 
> overall, and a typical OM in Bob's life occurs in the first half with an 
> effective probability of about 2/3.
>
> What would the diagram look like if there _were_ transfer of consciousness?  
> If the font displays each character the same width, roughly:
>
> 1 ---------------------====================
>
> 2 --------------------/
>
> Here the consciousness 'jumped' from 2 to 1.  This is what the QTI delusion 
> would have you believe.  It is impossible because of local generation of 
> consciousness and the arbitrariness of personal identity.  If you study Bob 
> #1 after the midlife, there is nothing unusual about his brain that could 
> give it double the normal measure.

The consciousness wouldn't "jump" from 2 to 1, since consciousness is
not a thing that jumps. Instead, the earlier OMs of 1 consider that
they have a future in the later OMs of 1, and the later OMs of 1
consider that their past was in the earlier OMs of 1. If you look at
2, the OMs up to midlife (and termination) are identical to the OMs
of 1 up to midlife. But the important point is this: even though 1
and 2 are light years apart, 1 after midlife cannot claim to be the
continuation of the earlier 1 but not the continuation of 2, as there
is no "continuation" of anything as far as personal identity is
concerned. It's just that the older 1 remembers being the younger 1,
and with equal validity the younger 2, since their information
content is identical and identical information cannot be localised.
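
Incidentally, your 2/3 figure is easy to check on my reading of the
diagram (taking each copy's natural life to have length L, and
assuming, as I take you to intend, that measure is proportional to
the amount of run-time):

  measure of first-half OMs  = L/2 (copy 1) + L/2 (copy 2) = L
  measure of second-half OMs = L/2 (copy 1 only)
  P(OM is in first half)     = L / (L + L/2) = 2/3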


-- 
Stathis Papaioannou
