Re: [Fwd: NDPR David Shoemaker, Personal Identity and Ethics: A Brief Introduction]

2009-03-03 Thread Stathis Papaioannou

2009/3/4 Günther Greindl :

> Imagine the sequence:
>
> Scan - Annihilate - Signal - Reconstitute
>
> Now consider that the Signal travels for 100,000 light-years
> before it hits the reconstitution chamber (just to
> have a big distance, the concern is causal disconnection in spacetime).
>
> Now, in the meantime, the reconstitution chamber has been taken over by
> aliens (coming from the other side of the galaxy) who have advanced
> technology and can control the multiverse - they decide to tweak the
> multiverse so that the reconstitution happens in _no_ multiverse at all (by
> destroying all chambers).
>
> This would suggest that the no cul de sac conjecture implies that
> annihilation in the above sequence fails.
>
> But surely this can not depend on the decision of the aliens, who were
> nowhere near the causal lightcone of the annihilation event.
>
> This would imply one of three things (in my view, in decreasing order of
> plausibility):
>
> .) no cul-de-sac is false; no QI, even in RSSA scenarios.
> .) annihilation always fails. That is, if a copying machine exists,
> there will always be a version of you which feels that copying has not
> succeeded and "nothing happened" (even if you said you wanted to be
> annihilated after the duplication).
> .) COMP obeys global super-selection rules, akin to pre-determinism;
> that is, in scenarios where aliens destroy the chambers, annihilation
> fails, else not. Analogously for other scenarios.

The no-cul-de-sac hypothesis is false if you allow that there is some
means of destroying all copies in the multiverse. But there is
probably no such means, no matter how advanced the aliens.


-- 
Stathis Papaioannou

--~--~-~--~~~---~--~~
You received this message because you are subscribed to the Google Groups 
"Everything List" group.
To post to this group, send email to everything-l...@googlegroups.com
To unsubscribe from this group, send email to 
everything-list+unsubscr...@googlegroups.com
For more options, visit this group at 
http://groups.google.com/group/everything-list?hl=en
-~--~~~~--~~--~--~---



Re: [Fwd: NDPR David Shoemaker, Personal Identity and Ethics: A Brief Introduction]

2009-03-03 Thread Stathis Papaioannou

2009/3/4 Bruno Marchal :

> That is why the B people made a law, to help those who
> misunderstand the probability. If you decide (before duplication) to
> kill the copy, the choice of victim/torturer is still decided through
> the throw of a fair coin. This makes the decision unbiased by fake
> protocols based on a bad understanding of what the comp probabilities
> are.

Yes, deciding which copy will take on which role by a coin toss would
probably eliminate dynasties of torturers. This is an interesting
point, since introducing the coin toss does not actually do anything
to change the probability that you will end up being tortured; its
effect is mainly psychological.
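That the coin toss leaves the first-person probability unchanged can be checked with a tiny simulation (a sketch of my own, not code from the thread, assuming the thread's premise that you find yourself as either continuation with equal probability; the role names are illustrative):

```python
import random

def trial(assign_by_coin: bool) -> bool:
    """One duplication: return True if 'you' end up as the victim."""
    # Premise from the thread: you become either continuation equiprobably.
    you = random.choice(["original", "copy"])
    if assign_by_coin:
        # The B people's law: the victim is chosen by a fair coin toss.
        victim = random.choice(["original", "copy"])
    else:
        # Pre-arranged protocol: the copy is always the victim.
        victim = "copy"
    return you == victim

N = 200_000
p_fixed = sum(trial(False) for _ in range(N)) / N
p_coin = sum(trial(True) for _ in range(N)) / N
print(p_fixed, p_coin)  # both hover around 0.5
```

Either protocol gives you a 1/2 chance of being the victim; the coin toss changes who can rationalize the risk away, not the risk itself.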


-- 
Stathis Papaioannou




Re: [Fwd: NDPR David Shoemaker, Personal Identity and Ethics: A Brief Introduction]

2009-03-03 Thread Günther Greindl

Hi,

> better: this is just the "usual" comp-suicide self-selection (assuming  
> of course we can really kill the copies, which is in itself not an  
> obvious proposition).


I have been thinking along these lines lately, in a somewhat different
context: the teleportation with annihilation experiment together with
the no cul de sac conjecture and RSSA (that is, a case not covered by
Jack's paper).

Imagine the sequence:

Scan - Annihilate - Signal - Reconstitute

Now consider that the Signal travels for 100,000 light-years
before it hits the reconstitution chamber (just to
have a big distance, the concern is causal disconnection in spacetime).

Now, in the meantime, the reconstitution chamber has been taken over by
aliens (coming from the other side of the galaxy) who have advanced
technology and can control the multiverse - they decide to tweak the
multiverse so that the reconstitution happens in _no_ multiverse at all (by
destroying all chambers).

This would suggest that the no cul de sac conjecture implies that
annihilation in the above sequence fails.

But surely this can not depend on the decision of the aliens, who were
nowhere near the causal lightcone of the annihilation event.

This would imply one of three things (in my view, in decreasing order of
plausibility):

.) no cul-de-sac is false; no QI, even in RSSA scenarios.
.) annihilation always fails. That is, if a copying machine exists,
there will always be a version of you which feels that copying has not
succeeded and "nothing happened" (even if you said you wanted to be 
annihilated after the duplication).
.) COMP obeys global super-selection rules, akin to pre-determinism;
that is, in scenarios where aliens destroy the chambers, annihilation
fails, else not. Analogously for other scenarios.


Cheers,
Günther





Re: [Fwd: NDPR David Shoemaker, Personal Identity and Ethics: A Brief Introduction]

2009-03-03 Thread Bruno Marchal


On 03 Mar 2009, at 13:40, Stathis Papaioannou wrote:

>
> 2009/3/3 Bruno Marchal :
>
>> I think that comp practitioners will divide, in the long run,  along
>> three classes:
>>
>> A:  majority. Accept teleportation but disallow overlap of
>> "individuals": annihilation first, reconstitution after. No right to
>> self-infliction. In case of accidental or exceptional self-
>> multiplication, consent is asked at any time.
>> B: a stable minority (in the long run). Accept teleportation but do
>> allow overlap of individuals. Some will fight for the right of self-
>> infliction including the consent made before the duplication, but  
>> with
>> precise protocol. You know the problem of the masochist: I say no,
>> continue, I say "no no", stop!
>> C: the bandits. They violate protocols and don't ask for consent.
>> They should normally be wanted, I mean searched for by all the police
>> forces of the universe, or already be in jail or in an asylum.
>
> I think B might work, since it is more or less like the present
> situation, where our decisions are based on a rough risk-benefit
> analysis, i.e. we decide on a course of action if as a result
> gain*Pr(gain) >= loss*Pr(loss). So we decide to smoke, for example, if
> we judge the pleasure of smoking (or the suffering caused by trying to
> give it up) to outweigh the suffering that may result from
> smoking-related illnesses. However, there are also differences if the
> copies are allowed to overlap. If I make a decision that has an
> adverse effect on my future self I may regret the decision, but it's
> not possible to ask my past self to reverse it. On the other hand, if
> I agree for one of my copies to torture the other it is always
> possible for the victim to ask the torturer to release him. Also, it
> is possible for the torturer to come to believe that he is never at
> risk himself after repeated duplications: I've done this many times
> and it's always the *other* guy who suffers, not me, so there is no
> reason for me not to repeat the process. This would be so even if the
> agreement was for 100 copies to be made and 99 of them enslaved: the
> one who does the enslaving may come to believe that he is never at
> risk, and continue creating copies 100 at a time.


You can then imagine the surprise of the copy or copies: "I did
this often and thought there was no risk, but here I am, enslaved, and
I will suffer and die."

That is why the B people made a law, to help those who
misunderstand the probability. If you decide (before duplication) to
kill the copy, the choice of victim/torturer is still decided through
the throw of a fair coin. This makes the decision unbiased by fake
protocols based on a bad understanding of what the comp probabilities
are.

Iterating the procedure, with the throwing of the coin, could make you  
believe you are incredibly lucky, but the computationalist should know  
better: this is just the "usual" comp-suicide self-selection (assuming  
of course we can really kill the copies, which is in itself not an  
obvious proposition).

Bruno


http://iridia.ulb.ac.be/~marchal/







Re: [Fwd: NDPR David Shoemaker, Personal Identity and Ethics: A Brief Introduction]

2009-03-03 Thread ronaldheld

Stathis
  This was mentioned in the TNG technical manual. I do not recall,
right now, which post-TOS episodes mentioned it.
  Ronald

On Mar 2, 8:42 am, Stathis Papaioannou  wrote:
> 2009/3/2 ronaldheld :
>
>
>
> > Maybe the terminology does not fit here: to make a copy of my brain,
> > wouldn't you need more than memories, namely the state of the brain at
> > one time to "quantum resolution" (a TNG transporter term)?
>
> The question is what level of resolution is needed in order to copy
> the memories, personality etc. You may not need quantum resolution,
> since in that case it is hard to see how you could avoid drastic
> mental state changes while just sitting still. Also, in which TNG
> episode does it mention quantum resolution for the transporter?
>
> --
> Stathis Papaioannou



Re: [Fwd: NDPR David Shoemaker, Personal Identity and Ethics: A Brief Introduction]

2009-03-03 Thread Stathis Papaioannou

2009/3/3 Bruno Marchal :

> I think that comp practitioners will divide, in the long run,  along
> three classes:
>
> A:  majority. Accept teleportation but disallow overlap of
> "individuals": annihilation first, reconstitution after. No right to
> self-infliction. In case of accidental or exceptional self-
> multiplication, consent is asked at any time.
> B: a stable minority (in the long run). Accept teleportation but do
> allow overlap of individuals. Some will fight for the right of self-
> infliction including the consent made before the duplication, but with
> precise protocol. You know the problem of the masochist: I say no,
> continue, I say "no no", stop!
> C: the bandits. They violate protocols and don't ask for consent.
> They should normally be wanted, I mean searched for by all the police
> forces of the universe, or already be in jail or in an asylum.

I think B might work, since it is more or less like the present
situation, where our decisions are based on a rough risk-benefit
analysis, i.e. we decide on a course of action if as a result
gain*Pr(gain) >= loss*Pr(loss). So we decide to smoke, for example, if
we judge the pleasure of smoking (or the suffering caused by trying to
give it up) to outweigh the suffering that may result from
smoking-related illnesses. However, there are also differences if the
copies are allowed to overlap. If I make a decision that has an
adverse effect on my future self I may regret the decision, but it's
not possible to ask my past self to reverse it. On the other hand, if
I agree for one of my copies to torture the other it is always
possible for the victim to ask the torturer to release him. Also, it
is possible for the torturer to come to believe that he is never at
risk himself after repeated duplications: I've done this many times
and it's always the *other* guy who suffers, not me, so there is no
reason for me not to repeat the process. This would be so even if the
agreement was for 100 copies to be made and 99 of them enslaved: the
one who does the enslaving may come to believe that he is never at
risk, and continue creating copies 100 at a time.
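The enslaver's induction in the last paragraph can be made quantitative with a small sketch (my own illustration; only the 100-copies / 99-enslaved numbers come from the message): the chance of a single first-person thread staying the free one for n rounds is (1/100)^n, yet the risk on the next round is 99/100 no matter how lucky the past run was.

```python
from fractions import Fraction

# Per round: 100 continuations, 99 enslaved, 1 stays free.
P_FREE = Fraction(1, 100)

def p_still_free(n: int) -> Fraction:
    """First-person chance of escaping enslavement n rounds in a row."""
    return P_FREE ** n

# The surviving thread's history is astronomically unlikely...
print(p_still_free(3))  # 1/1000000
# ...but its next-round risk is the same as on round one:
print(1 - P_FREE)       # 99/100
```

Selection bias does the rest: the only thread in a position to repeat the procedure is the one that has, so far, always drawn the 1-in-100 outcome.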


-- 
Stathis Papaioannou



