Re: measure again '10

2010-02-03 Thread Nick Prince


On Feb 2, 7:07 am, Jack Mallah  wrote:
> --- On Wed, 1/27/10, Brent Meeker  wrote:
>
> > Jack is talking about copies in the common sense of initially physically 
> > identical beings who however occupy different places in the same spacetime 
> > and hence have different viewpoints and experiences.
>
> No, that's incorrect.  I don't know where you got that idea but I'd best put 
> that misconception to rest first.
>
> When I talk about copies I mean the same thing as the others on this list - 
> beings who not only start out as the same type but also receive the same type 
> of inputs and follow the same type of sequence of events.  Note: They follow 
> the same sequence because they use the same algorithm but they must operate 
> independently and in parallel - there are no causal links to enforce it.  If 
> there are causal links forcing them to be in lockstep I might say they are 
> shadows, not copies.
>
> Such copies each have their own, separate consciousness - it just happens to 
> be of the same type as that of the others.  It is not "redundancy" in the 
> sense of needless redundancy.  Killing one would end that consciousness, 
> yes.  In philosophy jargon, they are of the same type but are different 
> tokens of it.
>
> --- On Thu, 1/28/10, Jason Resch  wrote:
>
> > Total utilitarianism advocates measuring the utility of a population based 
> > on the total utility of its members.
> > Average utilitarianism, on the other hand, advocates measuring the utility 
> > of a population based on the average utility of that population.
>
> I basically endorse total utilitarianism.  (I'm actually a bit more 
> conservative but that isn't relevant here.)  I would say that average 
> utilitarianism is completely insane and evil.  Ending the existence of a 
> suffering person can be positive, but only if the quality of life of that 
> person is negative.  Such a person would probably want to die.  OTOH not 
> everyone who wants to die has negative utility, even if they think they do.
>
> --- On Wed, 1/27/10, Stathis Papaioannou  wrote:
>
> > if there were a million copies of me in lockstep and all but one were 
> > destroyed, then each of the million copies would feel that they had 
> > continuity of consciousness with the remaining one, so they are OK with 
> > what is about to happen.
>
> Suppose someone killed all copies but lied to them first, saying that they 
> would survive.  They would not feel worried.  Would that be OK?  It seems 
> like the same idea to me.
>
> > Your measure-preserving criterion for determining when it's OK to kill a 
> > person is just something you have made up because you think it sounds 
> > reasonable, and has nothing to do with the wishes and feelings of the 
> > person getting killed.
>
> First, I should reiterate something I have already said: It is not generally 
> OK to kill someone without their permission even if you replace them.  The 
> reason it's not OK is just that it's like enslaving someone - you are forcing 
> things for them.  This has nothing particularly to do with killing; the same 
> would apply, for example, to cutting off someone's arm and replacing it with 
> a new one.  Even if the new one works fine, the guy has a right to be mad if 
> his permission was not asked for this.  That is an ethical issue.  I would 
> make an exception for a criminal or bad guy who I would want to imprison or 
> kill without his permission.
>
> That said, as my example of lying to the person shows, Stathis, your 
> criterion of caring about whether the person to be killed 'feels worried' is 
> irrelevant to the topic at hand.
>
> Measure preservation means that you are leaving behind the same number of 
> people you started with.  There is nothing arbitrary about that.  If, even 
> having obtained Bob's permission, you kill Bob, I'd say you deserve to be 
> punished if I think Bob had value.  But if you also replace him with Charlie, 
> then if I judge that Bob and Charlie are of equal value, I'd say you deserve 
> to be punished and rewarded by the same amount.  The same goes if you kill 
> Bob and Dave and replace them with Bob' and Dave', or if you kill 2 Bobs and 
> replace them with 2 other Bobs.  That is measure preservation.  If you kill 2 
> Bobs and replace them with only one then you deserve a net punishment.
>
> > > Suppose there is a guy who is kind of a crazy oriental monk.  He 
> > > meditates and subjectively believes that he is now the reincarnation of 
> > > ALL other people.  Is it OK to now kill all other people and just leave 
> > > alive this one monk?
>
> > No, because the people who are killed won't feel that they have continuity 
> > of consciousness with the monk, unless the monk really did run emulations 
> > of all of them in his mind.
>
> They don't know what's in his mind either way, so what they believe before 
> being killed is utterly irrelevant here.  We can suppose for argument's sake 
> that they are all good peasants, they never miss giving their rice offerings, 
> and so they believe anything the monk tells them.  And he believes what he says.

Re: measure again '10

2010-02-02 Thread Stathis Papaioannou
On 2 February 2010 18:07, Jack Mallah  wrote:

> --- On Wed, 1/27/10, Stathis Papaioannou  wrote:
>> if there were a million copies of me in lockstep and all but one were 
>> destroyed, then each of the million copies would feel that they had 
>> continuity of consciousness with the remaining one, so they are OK with what 
>> is about to happen.
>
> Suppose someone killed all copies but lied to them first, saying that they 
> would survive.  They would not feel worried.  Would that be OK?  It seems 
> like the same idea to me.

You're assuming that it is false to say each of the million copies
would survive if all but one were destroyed. You seem to think that it
is OK if the (destructive) copying is one->one  but not if it is
many->one. In coming to this conclusion it seems you are taking the
general moral position that it is good to increase the number of
people in the world and bad to decrease it. However, for the purposes
of this discussion what is at issue is the *purely selfish*
perspective, not what is good for other people or the rest of the
world. From the purely selfish perspective, it does seem to you and to
me and to every other person that we survive if a future copy of us
exists, even though arguably this is just a delusion. And the delusion
of survival is perfectly maintained for each of the multiple copies
destroyed in a many->one replication, as perfectly as it is maintained
for a single copy destroyed in a one->many replication.

Here is a scenario to consider:

(1) A1 is destructively scanned, and from the scan a copy B1 is made.
A1 looks forward to surviving as B1, and B1 considers that he is the
same person as A1 and has survived. In a sense these beliefs may be
delusional, but A1 is happy and B1 is happy. After all, this is what
happens in ordinary life.

(2) As in (1), but in the next room there is another copy A2 in
lockstep with A1. A2 is destroyed when A1 is destroyed in the
destructive scanning. Now, does the presence of A2 diminish A1's
expectation that he will survive in B1, making the delusion somehow
less compelling or more delusional? I don't see by what process that
could possibly happen. A1 still feels he will survive in B1 so A1 is
happy; and if he learns at some point about the fate of A2, at worst
he will feel sorry for A2.

(3) I think you can see that the situation is exactly symmetrical for
A2. A2 feels he will survive as B1, and the presence of A1 does not
detract from this feeling at all. At worst, if he learns about A1 he
will feel sorry for him (although if he follows this thought
experiment he will realise that A1 will survive as well). And B1, of
course, feels he has survived as A1/A2; it is meaningless to say he
has survived as one or the other since the scanned subjective content
is the same in each case.

The result is that A1, A2 and B1 are all happy. No-one feels he has
lost anything from a selfish perspective, even though from an external
perspective, for better or worse, there is one less consciousness in
the world. And calling this delusional does not invalidate it, since
it is exactly the sort of delusion we all have and attempt at all
costs to preserve in ordinary life.

>> Your measure-preserving criterion for determining when it's OK to kill a 
>> person is just something you have made up because you think it sounds 
>> reasonable, and has nothing to do with the wishes and feelings of the person 
>> getting killed.
>
> First, I should reiterate something I have already said: It is not generally 
> OK to kill someone without their permission even if you replace them.  The 
> reason it's not OK is just that it's like enslaving someone - you are forcing 
> things for them.  This has nothing particularly to do with killing; the same 
> would apply, for example, to cutting off someone's arm and replacing it with 
> a new one.  Even if the new one works fine, the guy has a right to be mad if 
> his permission was not asked for this.  That is an ethical issue.  I would 
> make an exception for a criminal or bad guy who I would want to imprison or 
> kill without his permission.

That's a fair enough point: we should not destructively copy people
who don't want to go through the procedure, just as we shouldn't
photograph them if they think it will steal their soul and leave them
upset as a result. But the point I am trying to make is that
destructive copying with reduction in measure is *not* killing anyone,
any more than going to sleep and waking up the next day kills anyone.

> That said, as my example of lying to the person shows, Stathis, your 
> criterion of caring about whether the person to be killed 'feels worried' is 
> irrelevant to the topic at hand.
>
> Measure preservation means that you are leaving behind the same number of 
> people you started with.  There is nothing arbitrary about that.  If, even 
> having obtained Bob's permission, you kill Bob, I'd say you deserve to be 
> punished if I think Bob had value.  But if you also replace him with Charlie, 
> then if I judge that Bob and Charlie are of equal value, I'd say you deserve to 
> be punished and rewarded by the same amount.  The same goes if you kill Bob and 
> Dave and replace them with Bob' and Dave', or if you kill 2 Bobs and replace 
> them with 2 other Bobs.  That is measure preservation.  If you kill 2 Bobs and 
> replace them with only one then you deserve a net punishment.

Re: measure again '10

2010-02-01 Thread Brent Meeker

Jack Mallah wrote:

--- On Wed, 1/27/10, Brent Meeker  wrote:
  

Jack is talking about copies in the common sense of initially physically 
identical beings who however occupy different places in the same spacetime and 
hence have different viewpoints and experiences.



No, that's incorrect.  I don't know where you got that idea but I'd best put 
that misconception to rest first.

When I talk about copies I mean the same thing as the others on this list - 
beings who not only start out as the same type but also receive the same type 
of inputs and follow the same type of sequence of events.  Note: They follow 
the same sequence because they use the same algorithm but they must operate 
independently and in parallel - there are no causal links to enforce it.  If 
there are causal links forcing them to be in lockstep I might say they are 
shadows, not copies.
  


I don't see that as possible except possibly by realizing the two copies 
in two virtual realities so that the whole environment is simulated.  And 
the simulated worlds would have to be completely deterministic - no 
quantum randomness.




Such copies each have their own, separate consciousness - it just happens to be of the 
same type as that of the others.  It is not "redundancy" in the sense of 
needless redundancy.  Killing one would end that consciousness, yes.  In philosophy 
jargon, they are of the same type but are different tokens of it.
  


Philosophy jargon doesn't require that two of the same type be the same 
in every respect, e.g. A and A are two tokens of the same type, but they 
are not identical (one is to the left of the other, for example).


Brent




Re: measure again '10

2010-02-01 Thread Jack Mallah
--- On Wed, 1/27/10, Brent Meeker  wrote:
> Jack is talking about copies in the common sense of initially physically 
> identical beings who however occupy different places in the same spacetime 
> and hence have different viewpoints and experiences.

No, that's incorrect.  I don't know where you got that idea but I'd best put 
that misconception to rest first.

When I talk about copies I mean the same thing as the others on this list - 
beings who not only start out as the same type but also receive the same type 
of inputs and follow the same type of sequence of events.  Note: They follow 
the same sequence because they use the same algorithm but they must operate 
independently and in parallel - there are no causal links to enforce it.  If 
there are causal links forcing them to be in lockstep I might say they are 
shadows, not copies.

Such copies each have their own, separate consciousness - it just happens to be 
of the same type as that of the others.  It is not "redundancy" in the sense of 
needless redundancy.  Killing one would end that consciousness, yes.  In 
philosophy jargon, they are of the same type but are different tokens of it.

--- On Thu, 1/28/10, Jason Resch  wrote:
> Total utilitarianism advocates measuring the utility of a population based on 
> the total utility of its members.
> Average utilitarianism, on the other hand, advocates measuring the utility of 
> a population based on the average utility of that population.

I basically endorse total utilitarianism.  (I'm actually a bit more 
conservative but that isn't relevant here.)  I would say that average 
utilitarianism is completely insane and evil.  Ending the existence of a 
suffering person can be positive, but only if the quality of life of that 
person is negative.  Such a person would probably want to die.  OTOH not 
everyone who wants to die has negative utility, even if they think they do.

--- On Wed, 1/27/10, Stathis Papaioannou  wrote:
> if there were a million copies of me in lockstep and all but one were 
> destroyed, then each of the million copies would feel that they had 
> continuity of consciousness with the remaining one, so they are OK with what 
> is about to happen.

Suppose someone killed all copies but lied to them first, saying that they 
would survive.  They would not feel worried.  Would that be OK?  It seems like 
the same idea to me.

> Your measure-preserving criterion for determining when it's OK to kill a 
> person is just something you have made up because you think it sounds 
> reasonable, and has nothing to do with the wishes and feelings of the person 
> getting killed.

First, I should reiterate something I have already said: It is not generally OK 
to kill someone without their permission even if you replace them.  The reason 
it's not OK is just that it's like enslaving someone - you are forcing things 
for them.  This has nothing particularly to do with killing; the same would 
apply, for example, to cutting off someone's arm and replacing it with a new 
one.  Even if the new one works fine, the guy has a right to be mad if his 
permission was not asked for this.  That is an ethical issue.  I would make an 
exception for a criminal or bad guy who I would want to imprison or kill 
without his permission.

That said, as my example of lying to the person shows, Stathis, your criterion 
of caring about whether the person to be killed 'feels worried' is irrelevant 
to the topic at hand.

Measure preservation means that you are leaving behind the same number of 
people you started with.  There is nothing arbitrary about that.  If, even 
having obtained Bob's permission, you kill Bob, I'd say you deserve to be 
punished if I think Bob had value.  But if you also replace him with Charlie, 
then if I judge that Bob and Charlie are of equal value, I'd say you deserve to 
be punished and rewarded by the same amount.  The same goes if you kill Bob and 
Dave and replace them with Bob' and Dave', or if you kill 2 Bobs and replace 
them with 2 other Bobs.  That is measure preservation.  If you kill 2 Bobs and 
replace them with only one then you deserve a net punishment.
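
A minimal sketch of that bookkeeping, in Python (the names and "value" numbers 
below are invented purely for illustration, not part of the argument):

# Toy bookkeeping for "measure preservation": compare the total value of
# the people you leave behind with the total value you started with.
# All names and utility values here are made up for the example.

def net_change(before, after):
    """Positive = net reward deserved, negative = net punishment, zero = even."""
    return sum(after.values()) - sum(before.values())

# Kill Bob but replace him with Charlie of equal value: net zero.
print(net_change({"Bob": 1.0}, {"Charlie": 1.0}))             # 0.0

# Kill 2 Bobs and replace them with 2 other Bobs: still net zero.
print(net_change({"Bob1": 1.0, "Bob2": 1.0},
                 {"Bob3": 1.0, "Bob4": 1.0}))                 # 0.0

# Kill 2 Bobs and replace them with only one: net loss, hence net punishment.
print(net_change({"Bob1": 1.0, "Bob2": 1.0}, {"Bob": 1.0}))   # -1.0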

> > Suppose there is a guy who is kind of a crazy oriental monk.  He meditates 
> > and subjectively believes that he is now the reincarnation of ALL other 
> > people.  Is it OK to now kill all other people and just leave alive this 
> > one monk?
> 
> No, because the people who are killed won't feel that they have continuity of 
> consciousness with the monk, unless the monk really did run emulations of all 
> of them in his mind. 

They don't know what's in his mind either way, so what they believe before 
being killed is utterly irrelevant here.  We can suppose for argument's sake 
that they are all good peasants, they never miss giving their rice offerings, 
and so they believe anything the monk tells them.  And he believes what he says.

Perhaps what you were trying to get at is that _after_ they are killed, it will 
be OK if they

Re: measure again '10

2010-01-27 Thread Jason Resch
Jack,

You mentioned that ending the existence of a suffering copy can be positive.
I am curious: would you consider ending any observer whose quality of life
was less than the average (weighted by number of copies) quality of life of
all observers everywhere?  Consider this example:
http://en.wikipedia.org/wiki/Utilitarianism#Average_v_total

Total utilitarianism advocates measuring the utility of a population based
on the total utility of its members. According to Derek Parfit, this type of
utilitarianism falls victim to the Repugnant Conclusion, whereby large
numbers of people with very low but non-negative utility values can be seen
as a better goal than a population of a less extreme size living in comfort.
In other words, according to the theory, it is a moral good to breed more
people on the world for as long as total happiness rises. [13]

Average utilitarianism, on the other hand, advocates measuring the utility
of a population based on the average utility of that population. It avoids
Parfit's repugnant conclusion, but causes other problems like the Mere
Addition Paradox. For example, bringing a moderately happy person into a
very happy world would be seen as an immoral act; aside from this, the
theory implies that it would be a moral good to eliminate all people whose
happiness is below average, as this would raise the average happiness [14].
This could however be circumvented by assigning a low utility score to dead
people, and taking them into account in the average.


I think applying one of these philosophies could shed some light on the
inherent goodness or badness when it comes to ending a copy.


Jason




Re: measure again '10

2010-01-27 Thread Stathis Papaioannou
On 28 January 2010 05:31, Brent Meeker  wrote:

> If I understand you correctly, your discussion of "copies" really refers to
> copies that exist in different identical worlds, e.g. like different copies
> of the same AI running in identical virtual environments, so that they can
> run "in lockstep".  Jack is talking about copies in the common sense of
> initially physically identical beings who however occupy different places in
> the same spacetime and hence have different viewpoints and experiences.  So
> killing one of Jack's copies does end a separate stream of consciousness,
> while "killing" one copy of your lockstep copies just reduces the
> redundancy.  I note that such lockstep copies can only be realized if they
> are in different (virtual) worlds which are kept in lockstep in spite of
> quantum randomness.

Yes, that's what I meant by "copies". But they could also run in
perfect lockstep as parallel processes on a computer.
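
A minimal sketch of that idea, assuming nothing more than a toy deterministic
program (the state-update rule and the inputs below are invented for
illustration): two independent instances of the same program, fed the same
inputs, pass through identical states with no causal link between them.

# Two independent instances of the same deterministic "mind" program.
# Because the update rule and the input stream are identical, their state
# histories match step for step without any communication between them.
# The rule itself is a toy stand-in, not a model of anything real.

def step(state, inp):
    return (state * 31 + inp) % 1_000_003   # any fixed deterministic rule

inputs = [7, 42, 3, 3, 99]                  # same input stream for both copies

copy_a = copy_b = 0
for x in inputs:
    copy_a = step(copy_a, x)
    copy_b = step(copy_b, x)
    assert copy_a == copy_b                 # lockstep, with no causal link

print(copy_a, copy_b)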


-- 
Stathis Papaioannou




Re: measure again '10

2010-01-27 Thread Brent Meeker

Stathis Papaioannou wrote:

2010/1/27 Jack Mallah :

  

See above. That would be a measure-conserving process, so it would be OK.



I would be upset at the prospect of someone killing me even if they
filled the world with angelic beings by way of atonement, because it
would not feel as if any of them were me. On the other hand, if there
were a million copies of me in lockstep and all but one were
destroyed, then each of the million copies would feel that they had
continuity of consciousness with the remaining one, so they are OK
with what is about to happen. Your measure-preserving criterion for
determining when it's OK to kill a person is just something you have
made up because you think it sounds reasonable, and has nothing to do
with the wishes and feelings of the person getting killed.

  

It is just a matter of definition whether it is the same guy or a different 
guy. Because now we have one guy at a time, it is convenient to call them the 
same guy.  If we had two at once, we could call them the same if we like, but 
the fact would remain that they would have different (even if qualitatively the 
same) consciousnesses, so it is better to call them different guys.



Making two copies running in lockstep and killing one of them is equivalent to 
this: the one that is killed feels that his stream of consciousness continues 
in the one that is not killed. It is true that in the second case the number of 
living copies of the person has halved, but from the point of view of each copy 
it is exactly the same as the first case, where there is only ever one copy 
extant.
  

That one that is killed doesn't feel anything after he is killed.  The one that 
lives experiences whatever he would have experienced anyway.  There is NO 
TRANSFER of consciousness.  Killing a guy (assuming he is not an evil guy or in 
great pain) and not creating a new guy to replace him is always a net loss.



The general point is that what matters to the person is not the objective 
physical events, but the subjective effect that the objective physical events 
will have.
  

What matters is the objective reality that includes all subjective experiences.

Suppose there is a guy who is kind of a crazy oriental monk.  He meditates and 
subjectively believes that he is now the reincarnation of ALL other people.  Is 
it OK to now kill all other people and just leave alive this one monk?



No, because the people who are killed won't feel that they have
continuity of consciousness with the monk, unless the monk really did
run emulations of all of them in his mind. The fact of the matter is
that we are *not* the same physical being throughout our lives. The
matter in our bodies, including our brains, turns over as a result of
metabolic processes so that after a few months our physical makeup is
almost completely different. It is just a contingent fact of our
psychology that we consider ourselves to be the same person persisting
through time. You could call it a delusion. I recognise that it is a
delusion, but because my evolutionary program is so strong I can't
shake it, nor do I want to. I want the delusion to continue in the
same way it always has: the new version of me remembers being the old
version of me and anticipates becoming the even newer version of me.
If I'm killed with no remaining copies, the delusion ends, and that's
bad. If I go through ordinary life being destroyed and recreated by my
cellular machinery that's OK, and if I am destroyed and then
reconstituted then that's OK too, because the delusion is preserved.
And it wouldn't matter to me if more copies of me were destroyed than
reconstituted or allowed to live, since each of the copies would
continue to have the delusional belief that his consciousness will
continue in the sole survivor.


  
If I understand you correctly, your discussion of "copies" really refers 
to copies that exist in different identical worlds, e.g. like different 
copies of the same AI running in identical virtual environments, so that 
they can run "in lockstep".  Jack is talking about copies in the common 
sense of initially physically identical beings who however occupy 
different places in the same spacetime and hence have different 
viewpoints and experiences.  So killing one of Jack's copies does end a 
separate stream of consciousness, while "killing" one copy of your 
lockstep copies just reduces the redundancy.  I note that such lockstep 
copies can only be realized if they are in different (virtual) worlds 
which are kept in lockstep in spite of quantum randomness.


Brent




Re: measure again '10

2010-01-27 Thread Stathis Papaioannou
2010/1/27 Jack Mallah :

> See above. That would be a measure-conserving process, so it would be OK.

I would be upset at the prospect of someone killing me even if they
filled the world with angelic beings by way of atonement, because it
would not feel as if any of them were me. On the other hand, if there
were a million copies of me in lockstep and all but one were
destroyed, then each of the million copies would feel that they had
continuity of consciousness with the remaining one, so they are OK
with what is about to happen. Your measure-preserving criterion for
determining when it's OK to kill a person is just something you have
made up because you think it sounds reasonable, and has nothing to do
with the wishes and feelings of the person getting killed.

> It is just a matter of definition whether it is the same guy or a different 
> guy. Because now we have one guy at a time, it is convenient to call them the 
> same guy.  If we had two at once, we could call them the same if we like, but 
> the fact would remain that they would have different (even if qualitatively 
> the same) consciousnesses, so it is better to call them different guys.
>
>> Making two copies running in lockstep and killing one of them is equivalent 
>> to this: the one that is killed feels that his stream of consciousness 
>> continues in the one that is not killed. It is true that in the second case 
>> the number of living copies of the person has halved, but from the point of 
>> view of each copy it is exactly the same as the first case, where there is 
>> only ever one copy extant.
>
> That one that is killed doesn't feel anything after he is killed.  The one 
> that lives experiences whatever he would have experienced anyway.  There is 
> NO TRANSFER of consciousness.  Killing a guy (assuming he is not an evil guy 
> or in great pain) and not creating a new guy to replace him is always a net 
> loss.
>
>> The general point is that what matters to the person is not the objective 
>> physical events, but the subjective effect that the objective physical 
>> events will have.
>
> What matters is the objective reality that includes all subjective 
> experiences.
>
> Suppose there is a guy who is kind of a crazy oriental monk.  He meditates 
> and subjectively believes that he is now the reincarnation of ALL other 
> people.  Is it OK to now kill all other people and just leave alive this one 
> monk?

No, because the people who are killed won't feel that they have
continuity of consciousness with the monk, unless the monk really did
run emulations of all of them in his mind. The fact of the matter is
that we are *not* the same physical being throughout our lives. The
matter in our bodies, including our brains, turns over as a result of
metabolic processes so that after a few months our physical makeup is
almost completely different. It is just a contingent fact of our
psychology that we consider ourselves to be the same person persisting
through time. You could call it a delusion. I recognise that it is a
delusion, but because my evolutionary program is so strong I can't
shake it, nor do I want to. I want the delusion to continue in the
same way it always has: the new version of me remembers being the old
version of me and anticipates becoming the even newer version of me.
If I'm killed with no remaining copies, the delusion ends, and that's
bad. If I go through ordinary life being destroyed and recreated by my
cellular machinery that's OK, and if I am destroyed and then
reconstituted then that's OK too, because the delusion is preserved.
And it wouldn't matter to me if more copies of me were destroyed than
reconstituted or allowed to live, since each of the copies would
continue to have the delusional belief that his consciousness will
continue in the sole survivor.


-- 
Stathis Papaioannou




Re: measure again '10

2010-01-27 Thread Bruno Marchal


On 26 January 2010, at 22:29, Jack Mallah wrote:


--- On Tue, 1/26/10, Bruno Marchal  wrote:

On 25 Jan 2010, at 23:16, Jack Mallah wrote:

Killing one man is not OK just because he has a brother.


In our context, the 'brother' has the same consciousness.


The "brother" most certainly does not have "the same" consciousness. 
If he did, then killing him would not change the total _amount_ of 
consciousness; measure would be conserved. What the brother does have 
is his own, different in terms of who experiences it, but 
qualitatively identical consciousness.


But then my consciousness here and now is different from my 
consciousness after a short interval, and is no longer something related 
to "me" in any sense of "me" acceptable under the comp assumption. It 
seems that this makes my consciousness here and now infinitely 
implausible in the absolute measure (if this makes sense).


I would also not say yes to a computationalist doctor, because my 
consciousness would be related to the diameter of the simulated neurons, 
or to the redundancy of the gates, etc. (and this even though the behavior 
remains unaffected). This also entails the existence of zombies. If the 
neurons are very thin, my "absolute" measure can be made quasi-null, 
even though my behavior again remains unaffected.





From this I conclude you would say "no" to the doctor. All right? The 
doctor certainly "kills a 'brother'".


As you should know by now Bruno, if you are now talking about a 
teleportation experiment, in that case you kill one guy (bad) but 
create another, qualitatively identical guy (good).  So the net effect 
is OK.  Of course the doctor should get the guy's permission before 
doing anything, if he can.


Certainly. But if you mean by this that you would say "yes", it is only as 
a form of altruism: *you* and *your* consciousness die in the process. 
The net effect is OK, but only for your mother, friends, or any third-
person observer. Comp, as I use the term, means that it is OK in the 
usual sense of surviving a clinical operation.




BTW, it may seem that I advocate increased population - that is, if we 
had a cloning device, we should use it.  In general, yes, but a planet 
has a limited capacity to support a high population over a long term, 
which we may have already exceeded.  Too much at once will result in a 
lower total population over time due to a ruined environment as well 
as lower quality of life.  So in practice, it would cause problems.  
But if we had a second planet available and the question is whether we 
should populate it, I'd say yes.


Apparently we agree on what we disagree about. Your position is not 
computationalism, where identity does not depend on the implementation 
of the program, and two computers running the same program can be seen 
as a special implementation of one program, like in a spacecraft.




--- On Mon, 1/25/10, Stathis Papaioannou  wrote:

Killing a man is bad because he doesn't want to be killed,


Actually that's not why - but let that pass for now.

and he doesn't want to be killed because he believes that act would 
cause his stream of consciousness to end. However, if he were killed 
and his stream of consciousness continued, that would not be a 
problem provided that the manner of death was not painful. Backing up 
his mind, killing him and then making an exact copy of the man at the 
moment before death is an example of this process.


See above. That would be a measure-conserving process, so it would be 
OK.


But the measure will depend on the implementation type, with single or 
doubled neurons, etc. And this is not relevant if we are digital 
machines (programs).





It is just a matter of definition whether it is the same guy or a 
different guy. Because now we have one guy at a time, it is convenient 
to call them the same guy.  If we had two at once, we could call them 
the same if we like, but the fact would remain that they would have 
different (even if qualitatively the same) consciousnesses, so it is 
better to call them different guys.


They would have different (but qualitatively identical) consciousnesses 
only if they are genuinely different guys, so it could not be a matter of 
definition. But then what is *a* guy?




Making two copies running in lockstep and killing one of them is 
equivalent to this: the one that is killed feels that his stream of 
consciousness continues in the one that is not killed. It is true 
that in the second case the number of living copies of the person has 
halved, but from the point of view of each copy it is exactly the 
same as the first case, where there is only ever one copy extant.


That one that is killed doesn't feel anything after he is killed.  The 
one that lives experiences whatever he would have experienced anyway.  
There is NO TRANSFER of consciousness.  Killing a guy (assuming he is 
not an evil guy or in great pain) and not creating a new guy to 
replace him is always a net loss.


For who?




The general point is that what matters to the person is not the objective 
physical events, but the subjective effect that the objective physical 
events will have.

Re: measure again '10

2010-01-26 Thread Nick Prince
Thank you Jack for your response.

>That one that is killed doesn't feel anything after he is killed.  The one 
>that lives experiences whatever he would have experienced anyway.  There is NO 
>TRANSFER of consciousness.  Killing a guy (assuming he is not an evil guy or 
>in great pain) and not creating a new guy to replace him is always a net loss.

It seems that at the root of things you are arguing that if you have
one person up until time t and can make a so-called identical copy at
that time (or another), then whether the original is killed or not, if
the new copy is instantiated it would "feel", and therefore think,
itself to be the person it was (because of memories), but that would be
an illusion (a bit like the droids in Blade Runner).  Conversely, the
person who is to be copied should be worried, because once killed he
does not experience the future (as the copy does).

Forgetting about MWI for now, and just thinking about why I feel some
continuity in my subjective experience: I feel that it must have
something to do with the fact that the me at time t+dt is an (almost)
identical copy of the me at time t.  If I deny this then I could
accept that there was no TRANSFER of consciousness between copies.  Yet
my experience makes me feel that there is?

Best wishes

Nick Prince




Re: measure again '10

2010-01-26 Thread Jack Mallah
--- On Tue, 1/26/10, Bruno Marchal  wrote:
> On 25 Jan 2010, at 23:16, Jack Mallah wrote:
> > Killing one man is not OK just because he has a brother.
>  
> In our context, the 'brother' has the same consciousness.

The "brother" most certainly does not have "the same" consciousness. If he did, 
then killing him would not change the total _amount_ of consciousness; measure 
would be conserved. What the brother does have is his own, different in terms 
of who experiences it, but qualitatively identical consciousness.

> From this I conclude you would say "no" to the doctor. All right? The doctor 
> certainly "kills a 'brother'".

As you should know by now Bruno, if you are now talking about a teleportation 
experiment, in that case you kill one guy (bad) but create another, 
qualitatively identical guy (good).  So the net effect is OK.  Of course the 
doctor should get the guy's permission before doing anything, if he can.

BTW, it may seem that I advocate increased population - that is, if we had a 
cloning device, we should use it.  In general, yes, but a planet has a limited 
capacity to support a high population over a long term, which we may have 
already exceeded.  Too much at once will result in a lower total population 
over time due to a ruined environment as well as lower quality of life.  So in 
practice, it would cause problems.  But if we had a second planet available and 
the question is whether we should populate it, I'd say yes.

--- On Mon, 1/25/10, Stathis Papaioannou  wrote:
> Killing a man is bad because he doesn't want to be killed,

Actually that's not why - but let that pass for now.

> and he doesn't want to be killed because he believes that act would cause his 
> stream of consciousness to end. However, if he were killed and his stream of 
> consciousness continued, that would not be a problem provided that the manner 
> of death was not painful. Backing up his mind, killing him and then making an 
> exact copy of the man at the moment before death is an example of this 
> process.

See above. That would be a measure-conserving process, so it would be OK.

It is just a matter of definition whether it is the same guy or a different 
guy. Because now we have one guy at a time, it is convenient to call them the 
same guy.  If we had two at once, we could call them the same if we like, but 
the fact would remain that they would have different (even if qualitatively the 
same) consciousnesses, so it is better to call them different guys.

> Making two copies running in lockstep and killing one of them is equivalent 
> to this: the one that is killed feels that his stream of consciousness 
> continues in the one that is not killed. It is true that in the second case 
> the number of living copies of the person has halved, but from the point of 
> view of each copy it is exactly the same as the first case, where there is 
> only ever one copy extant.

That one that is killed doesn't feel anything after he is killed.  The one that 
lives experiences whatever he would have experienced anyway.  There is NO 
TRANSFER of consciousness.  Killing a guy (assuming he is not an evil guy or in 
great pain) and not creating a new guy to replace him is always a net loss.

> The general point is that what matters to the person is not the objective 
> physical events, but the subjective effect that the objective physical events 
> will have.

What matters is the objective reality that includes all subjective experiences.

Suppose there is a guy who is kind of a crazy oriental monk.  He meditates and 
subjectively believes that he is now the reincarnation of ALL other people.  Is 
it OK to now kill all other people and just leave alive this one monk?




  




Re: measure again '10

2010-01-26 Thread Bruno Marchal


On 25 Jan 2010, at 23:16, Jack Mallah wrote:


Killing one man is not OK just because he has a brother.



In our context, the 'brother' has the same consciousness. From this I 
conclude you would say "no" to the doctor. All right? The doctor 
certainly "kills a 'brother'".


Bruno Marchal


http://iridia.ulb.ac.be/~marchal/


