--- On Wed, 1/27/10, Brent Meeker <meeke...@dslextreme.com> wrote:
> Jack is talking about copies in the common sense of initially physically 
> identical beings who however occupy different places in the same spacetime 
> and hence have different viewpoints and experiences.

No, that's incorrect.  I don't know where you got that idea but I'd best put 
that misconception to rest first.

When I talk about copies I mean the same thing as the others on this list - 
beings who not only start out as the same type but also receive the same type 
of inputs and follow the same type of sequence of events.  Note: They follow 
the same sequence because they use the same algorithm but they must operate 
independently and in parallel - there are no causal links to enforce it.  If 
there are causal links forcing them to be in lockstep I might say they are 
shadows, not copies.

Such copies each have their own, separate consciousness - it just happens to be 
of the same type as that of the others.  It is not "redundancy" in the sense of 
needless redundancy.  Killing one would end that consciousness, yes.  In 
philosophy jargon, they are of the same type but are different tokens of it.

--- On Thu, 1/28/10, Jason Resch <jasonre...@gmail.com> wrote:
> Total utilitarianism advocates measuring the utility of a population based on 
> the total utility of its members.
> Average utilitarianism, on the other hand, advocates measuring the utility of 
> a population based on the average utility of that population.

I basically endorse total utilitarianism.  (I'm actually a bit more 
conservative but that isn't relevant here.)  I would say that average 
utilitarianism is completely insane and evil.  Ending the existence of a 
suffering person can be positive, but only if the quality of life of that 
person is negative.  Such a person would probably want to die.  OTOH not 
everyone who wants to die has negative utility, even if they think they do.
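
To make this concrete, here is a toy calculation (my own illustrative numbers, as a small Python sketch) of why I consider average utilitarianism perverse: removing a person whose quality of life is positive but merely below average raises the average while lowering the total.

def total_utility(utilities):
    return sum(utilities)

def average_utility(utilities):
    return sum(utilities) / len(utilities)

# Four people; the last one is happy, just less happy than the others.
population = [10, 10, 10, 2]
print(total_utility(population), average_utility(population))  # 32 8.0

# "Improve" the world by average-utilitarian lights: remove the outlier.
culled = population[:-1]
print(total_utility(culled), average_utility(culled))  # 30 10.0

# The average went up (8.0 -> 10.0) even though someone with a positive
# quality of life was eliminated; the total correctly registers a loss.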

--- On Wed, 1/27/10, Stathis Papaioannou <stath...@gmail.com> wrote:
> if there were a million copies of me in lockstep and all but one were 
> destroyed, then each of the million copies would feel that they had 
> continuity of consciousness with the remaining one, so they are OK with what 
> is about to happen.

Suppose someone killed all copies but lied to them first, saying that they 
would survive.  They would not feel worried.  Would that be OK?  It seems like 
the same idea to me.

> Your measure-preserving criterion for determining when it's OK to kill a 
> person is just something you have made up because you think it sounds 
> reasonable, and has nothing to do with the wishes and feelings of the person 
> getting killed.

First, I should reiterate something I have already said: It is not generally OK 
to kill someone without their permission even if you replace them.  The reason 
it's not OK is just that it's like enslaving someone - you are forcing things on 
them.  This has nothing particularly to do with killing; the same would
apply, for example, to cutting off someone's arm and replacing it with a new 
one.  Even if the new one works fine, the guy has a right to be mad if his 
permission was not asked for this.  That is an ethical issue.  I would make an 
exception for a criminal or bad guy who I would want to imprison or kill 
without his permission.

That said, as my example of lying to the person shows, Stathis, your criterion 
of caring about whether the person to be killed 'feels worried' is irrelevant 
to the topic at hand.

Measure preservation means that you are leaving behind the same number of 
people you started with.  There is nothing arbitrary about that.  If, even 
having obtained Bob's permission, you kill Bob, I'd say you deserve to be 
punished if I think Bob had value.  But if you also replace him with Charlie, 
then if I judge that Bob and Charlie are of equal value, I'd say you deserve to 
be punished and rewarded by the same amount.  The same goes if you kill Bob and 
Dave and replace them with Bob' and Dave', or if you kill 2 Bobs and replace 
them with 2 other Bobs.  That is measure preservation.  If you kill 2 Bobs and 
replace them with only one then you deserve a net punishment.
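
If it helps, here is that bookkeeping spelled out as a small Python sketch (my own toy framing, with each copy counted as one unit of measure and all the people involved judged to be of equal value):

def net_measure_change(killed, created):
    # Each list entry is the measure of one person destroyed or created;
    # a net of zero means punishment and reward cancel out.
    return sum(created) - sum(killed)

print(net_measure_change(killed=[1], created=[1]))        # Bob -> Charlie: 0
print(net_measure_change(killed=[1, 1], created=[1, 1]))  # Bob and Dave -> Bob' and Dave': 0
print(net_measure_change(killed=[1, 1], created=[1]))     # 2 Bobs -> 1 Bob: -1, net punishment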

> > Suppose there is a guy who is kind of a crazy oriental monk.  He meditates 
> > and subjectively believes that he is now the reincarnation of ALL other 
> > people.  Is it OK to now kill all other people and just leave alive this 
> > one monk?
> 
> No, because the people who are killed won't feel that they have continuity of 
> consciousness with the monk, unless the monk really did run emulations of all 
> of them in his mind. 

They don't know what's in his mind either way, so what they believe before 
being killed is utterly irrelevant here.  We can suppose for argument's sake 
that they are all good peasants, they never miss giving their rice offerings, 
and so they believe anything the monk tells them.  And he believes what he says.

Perhaps what you were trying to get at is that _after_ they are killed, it will 
be OK if they really do find themselves reincarnated in the monk.  But who 
decides if that occurred or not?  The monk thinks it did; that criterion would 
make his belief self-consistent.  Nor can you require the number of people to 
be conserved - we know fission (as when learning the result of a QM experiment) 
and fusion would be possible.  Nor can you use the criterion of memory, unless 
you are prepared to say that memory loss changes one person into a different 
person.  If so, you will die when you forget where you put your car keys.

The reality is, there is no non-arbitrary criterion for personal identity over 
time.

Personal identity, being a matter of arbitrary definition, can have no 
relevance to what is observable.   What matters is the measure distribution.

> The fact of the matter is that we are *not* the same physical being 
> throughout our lives. The matter in our bodies, including our brains, turns 
> over as a result of metabolic processes so that after a few months our 
> physical makeup is almost completely different. It is just a contingent fact 
> of our psychology that we consider ourselves to be the same person persisting 
> through time. You could call it a delusion. I recognise that it is a 
> delusion, but because my evolutionary program is so strong I can't shake it, 
> nor do I want to. I want the delusion to continue in the same way it always 
> has: the new version of me remembers being the old version of me and 
> anticipates becoming the even newer version of me.

If you really acknowledge that it is a delusion, that is good progress.  But 
you are wrong that you can't shake it.  In fact, if you admit it is a delusion 
then you admit that it is false and you have already shaken it.

Of course, my utility function remains strongly peaked in favor of people very 
similar to my current self so that in practical terms I behave normally.

> And it wouldn't matter to me if more copies of me were destroyed than 
> reconstituted or allowed to live, since each of the copies would continue to 
> have the delusional belief that his consciousness will continue in the sole 
> survivor.

That is the heart of the matter.  Such a delusion is both false and dangerous.  
My task is to convince you, and others like you, that while your current 
consciousness will not continue per se no matter what, the consciousness of 
your future selves has value.  And the more of them there are (in terms of 
measure) the more value, because they do not share a single consciousness even 
if they are all of the same type.

--- On Tue, 1/26/10, Nick Prince <m...@dtech.fsnet.co.uk> wrote:
> It seems that at the root of things you are arguing that if you have one 
> person up until time t and can make a so called identical copy at that time 
> (or another), then whether the original is killed or not, if the new copy is 
> instantiated then it would "feel" and therefore think itself to be the person 
> it was (because of memories) but that would be illusion (Bit like the droids 
> in blade runner). 

Nick, you too seem to be missing the larger point that personal identity is not 
fundamental.  Assuming they both live, the new copy has as much claim to be the 
original as the future version of the original body.  I could equally say that 
_neither_ of these future people is the same person as the past original.  It 
is not a meaningful question.  What IS meaningful is that copying increases the 
number of consciousnesses, while killing decreases it.

> Forgetting about MWI for now and just thinking about why I feel some 
> continuity in my subjective experience.  I feel that it must have something 
> to do with the fact that me at time t+dt is an (almost) identical copy to the 
> me at time t.  If I deny this then I could accept there was no TRANSFER of 
> consciousness between copies.  Yet my experience makes me feel that there is?

Your brain gives rise to consciousness at time t1.  It also gives rise to 
consciousness at time t2.  Was there any transfer of consciousness from time t1 
to t2?  No, because whether your brain gives rise to consciousness at time t 
depends only on the situation at time t.  If time t1 had never existed and your 
brain had sprung into existence just before t2, in the same state it would 
otherwise have had at t2, there would be no difference at t2.

Of course, there is transfer of information from t1 to t2 due to the laws of 
physics, even if your brain went unconscious in the meantime; this is 
causality.  When we talk about different copies, even this is completely out of 
the picture; there is no information transfer between copies.

This is a key point so I'll try to illustrate it with a diagram:

1 ---------------------------------------------------

2 ---------------------------------------------------

Say this represents 2 copies of Bob, with forward time being to the right.  
Each "-" is an observer-moment (OM).  The copies receive the same kind of 
inputs and so on, and they remain identical until they suddenly terminate.  They 
evolve independently, though; perhaps they are many light-years apart.

Now, suppose we have the option of killing copy #2 about halfway along Bob's 
natural life.  If we do, the diagram will now look like this:

1 -----------------------------------------

2 --------------------

Copy #1 remains unchanged.  If we believe in a reasonable model for 
consciousness, that is because local conditions give rise to consciousness in a 
locally determined amount.  Clearly, there is less of Bob in this case overall, 
and a typical OM in Bob's life occurs in the first half with an effective 
probability of about 2/3: the first half now contains two copies' worth of OMs, 
while the second half contains only one.
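
In case the 2/3 is not obvious, here is the counting as a short Python sketch (my own illustration, using 50 OMs per half-life just to have concrete numbers):

def effective_probability_first_half(om_counts):
    # om_counts maps (copy, half) -> number of OMs that copy contributes there.
    total = sum(om_counts.values())
    first = sum(n for (copy, half), n in om_counts.items() if half == "first")
    return first / total

om_counts = {
    ("copy1", "first"): 50, ("copy1", "second"): 50,  # copy #1 lives a full life
    ("copy2", "first"): 50, ("copy2", "second"): 0,   # copy #2 is killed at the midpoint
}
print(effective_probability_first_half(om_counts))  # 100/150 = 0.666...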

What would the diagram look like if there _were_ transfer of consciousness?  If 
the font displays each character the same width, roughly:

1 ---------------------====================

2 --------------------/

Here the consciousness 'jumped' from 2 to 1.  This is what the QTI delusion 
would have you believe.  It is impossible because of local generation of 
consciousness and the arbitrariness of personal identity.  If you study Bob #1 
after the midpoint, there is nothing unusual about his brain that could give it 
double the normal measure.

I also note that the QTI idea is empirically shown to be false because we are 
not older than the typical human lifespan, as we would be with nearly certain 
effective probability had it been true.
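
To put rough numbers on that (purely illustrative, with an arbitrary horizon standing in for "practically unbounded"): if measure did not fall off with age, almost none of it would lie below the normal human lifespan, yet a normal age is exactly what every one of us observes.

typical_lifespan_years = 120   # generous bound on a normal human lifespan
qti_horizon_years = 10**9      # arbitrary stand-in for an effectively unbounded QTI lifespan

# Effective probability of a randomly sampled OM falling at a normal age,
# if measure were spread evenly out to the horizon.
print(typical_lifespan_years / qti_horizon_years)  # 1.2e-07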

--- On Wed, 1/27/10, Bruno Marchal <marc...@ulb.ac.be> wrote:
> Your position is not computationalism

That's BS, but in this thread I want to concentrate on the measure issue, which 
is completely general regardless of whether the substrate is computationalist 
or not.  Bruno, I will reply in the 'problem of size' thread.

> In what sense does an objective reality include a subjective experience? This 
> is a highly ambiguous way to talk. It could entail confusion of level of 
> description, and perspective.

If a given subjective experience exists at all - if it is a feature of reality 
- then any complete and objective description of reality would include it, and 
all others.  There is nothing ambiguous or confusing about it.  It is obviously 
true, by definition of "complete".
