Stathis Papaioannou wrote:
On 05/07/07, Heartland <[EMAIL PROTECTED]> wrote:
...
But different moments of existence in a single person's life can also
be regarded as different instances. This is strikingly obvious in
block universe theories of time, which are empirically
indistinguishable fro
On 05/07/07, Heartland <[EMAIL PROTECTED]> wrote:
At this point it might be useful to think about why we lack access to subjective
experience of a different person. (Yes, I'm assuming my neighbor is a different
person. If you don't agree with this assumption (and if you don't please tell me
why)
On Jul 4, 2007, at 5:59 PM, Heartland wrote:
On Jul 4, 2007, at 1:14 AM, Tom McCabe wrote:
That definition isn't accurate, because it doesn't
match what we intuitively see as 'death'. 'Death' is
actually fairly easy to define, compared to "good" or
even "truth"; I would define it as the permane
On Jul 4, 2007, at 5:47 PM, Tom McCabe wrote:
--- Randall Randall <[EMAIL PROTECTED]>
wrote:
On Jul 4, 2007, at 3:17 PM, Tom McCabe wrote:
So, we die whenever we're put under anesthesia?
No, I don't think so.
But I thought you just defined death as "the cessation
of the process of life". If
Heartland:
I would suggest focusing on the definition of life first. Only then can one have
a decent chance of getting the correct definition of death (the absence of life).
Life is not just a collection of atoms arranged into a special pattern. It is, at
least, a spatiotemporal process guided by a sp
Sure, but does it matter if I'm "dead" or "not dead" or
"physiologically dead but not information theoretically dead" between
the time my heart stops and the time when my upload is turned on? I
don't care, as long as the upload works. Although I guess I wouldn't
notice if I was dead and they could
On Jul 4, 2007, at 1:14 AM, Tom McCabe wrote:
That definition isn't accurate, because it doesn't
match what we intuitively see as 'death'. 'Death' is
actually fairly easy to define, compared to "good" or
even "truth"; I would define it as the permanent
destruction of a large portion of the inf
I think the debate is not so much over what qualifies
as "alive" as what qualifies as "death". Most people
couldn't care less about whether viruses are "really"
alive, but the death of 150,000 people a day affects
virtually everyone.
- Tom
--- Jey Kottalam <[EMAIL PROTECTED]> wrote:
> On 7/4/07
On 7/4/07, Tom McCabe <[EMAIL PROTECTED]> wrote:
--- Randall Randall <[EMAIL PROTECTED]>
wrote:
>
> On Jul 4, 2007, at 1:14 AM, Tom McCabe wrote:
>
> > That definition isn't accurate, because it doesn't
> > match what we intuitively see as 'death'. 'Death'
> is
> > actually fairly easy to defin
--- Randall Randall <[EMAIL PROTECTED]>
wrote:
>
> On Jul 4, 2007, at 3:17 PM, Tom McCabe wrote:
> > --- Randall Randall <[EMAIL PROTECTED]>
> > wrote:
> >> On Jul 4, 2007, at 1:14 AM, Tom McCabe wrote:
> >>> That definition isn't accurate, because it
> doesn't
> >>> match what we intuitively se
On Jul 4, 2007, at 3:17 PM, Tom McCabe wrote:
--- Randall Randall <[EMAIL PROTECTED]>
wrote:
On Jul 4, 2007, at 1:14 AM, Tom McCabe wrote:
That definition isn't accurate, because it doesn't
match what we intuitively see as 'death'. 'Death'
is
actually fairly easy to define, compared to "good
--- Randall Randall <[EMAIL PROTECTED]>
wrote:
>
> On Jul 4, 2007, at 1:14 AM, Tom McCabe wrote:
>
> > That definition isn't accurate, because it doesn't
> > match what we intuitively see as 'death'. 'Death'
> is
> > actually fairly easy to define, compared to "good"
> or
> > even "truth"; I wo
On 04/07/07, MindInstance <[EMAIL PROTECTED]> wrote:
I would suggest focusing on the definition of life first. Only then can one have
a decent chance of getting the correct definition of death (the absence of life).
Life is not just a collection of atoms arranged into a special pattern. It is, at
leas
On 04/07/07, Heartland <[EMAIL PROTECTED]> wrote:
> Right, but Heartland disagrees, and the post was aimed at him and
> others who believe that "a copy isn't really you".
Stathis, I don't subscribe to your assertion that a person after gradual
replacement of atoms in his brain is a copy.
Yes,
On Jul 4, 2007, at 1:14 AM, Tom McCabe wrote:
That definition isn't accurate, because it doesn't
match what we intuitively see as 'death'. 'Death' is
actually fairly easy to define, compared to "good" or
even "truth"; I would define it as the permanent
destruction of a large portion of the info
On 04/07/07, Tom McCabe <[EMAIL PROTECTED]> wrote:
That definition isn't accurate, because it doesn't
match what we intuitively see as 'death'. 'Death' is
actually fairly easy to define, compared to "good" or
even "truth"; I would define it as the permanent
destruction of a large portion of the
Death isn't just the absence of life; it's the
cessation of life that once existed. The Boötes Void,
so far as we know, has no life at all, and yet nobody
feels it is a great tragedy.
- Tom
--- MindInstance <[EMAIL PROTECTED]> wrote:
> >> Objective observers care only about the type of a
> pers
Objective observers care only about the type of a person and whether it's
instantiated, not about the fate of its instances (because, frankly, they're not
aware of the difference between the type and an instance). But since I know
better, I would be sad about dead instances. The point is whether
On 04/07/07, Tom McCabe <[EMAIL PROTECTED]> wrote:
That definition isn't accurate, because it doesn't
match what we intuitively see as 'death'. 'Death' is
actually fairly easy to define, compared to "good" or
even "truth"; I would define it as the permanent
destruction of a large portion of the i
That definition isn't accurate, because it doesn't
match what we intuitively see as 'death'. 'Death' is
actually fairly easy to define, compared to "good" or
even "truth"; I would define it as the permanent
destruction of a large portion of the information that
makes up a sentient being's mind.
-
On 04/07/07, Tom McCabe <[EMAIL PROTECTED]> wrote:
Using that definition, everyone would die at an age of
a few months, because the brain's matter is regularly
replaced by new organic chemicals.
I know that, which is why I asked the question. It's easy enough to
give a precise and objective def
Using that definition, everyone would die at an age of
a few months, because the brain's matter is regularly
replaced by new organic chemicals.
- Tom
--- Stathis Papaioannou <[EMAIL PROTECTED]> wrote:
> On 30/06/07, Heartland <[EMAIL PROTECTED]>
> wrote:
>
> > Objective observers care only abo
On 30/06/07, Heartland <[EMAIL PROTECTED]> wrote:
Objective observers care only about the type of a person and whether it's
instantiated, not about the fate of its instances (because, frankly, they're not
aware of the difference between the type and an instance). But since I know
better, I would
--- Charles D Hixson <[EMAIL PROTECTED]>
wrote:
> Tom McCabe wrote:
> > -...
> > To quote:
> >
> > "I am not sure you are capable of following an
> > argument"
> >
> > If I'm not capable of even following an argument,
> it's
> > a pretty clear implication that I don't understand
> the
> > argumen
Tom McCabe wrote:
-...
To quote:
"I am not sure you are capable of following an
argument"
If I'm not capable of even following an argument, it's
a pretty clear implication that I don't understand the
argument.
You have thus far made no attempt that I have been able to detect to
justify the
--- Jef Allbright <[EMAIL PROTECTED]> wrote:
> On 7/2/07, Tom McCabe <[EMAIL PROTECTED]>
> wrote:
> > "
> > I am not sure you are capable of following an
> argument
> > in a manner that makes it worth my while to
> continue.
> >
> > - s"
> >
> > So, you're saying that I have no idea what I'm
> ta
On 7/2/07, Tom McCabe <[EMAIL PROTECTED]> wrote:
"
I am not sure you are capable of following an argument
in a manner that makes it worth my while to continue.
- s"
So, you're saying that I have no idea what I'm talking
about, so therefore you're not going to bother arguing
with me anymore. Thi
"
I am not sure you are capable of following an argument
in a manner that makes it worth my while to continue.
- s"
So, you're saying that I have no idea what I'm talking
about, so therefore you're not going to bother arguing
with me anymore. This is a classic example of an ad
hominem argument. T
Tom McCabe wrote:
--- Samantha Atkins <[EMAIL PROTECTED]> wrote:
Tom McCabe wrote:
--- Samantha Atkins <[EMAIL PROTECTED]> wrote:
Out of the bazillions of possible ways to
configure
matter only a
ridiculously tiny fraction are more intelligent
th
--- Samantha Atkins <[EMAIL PROTECTED]> wrote:
> Tom McCabe wrote:
> > --- Samantha Atkins <[EMAIL PROTECTED]> wrote:
> >
> >
> >>
> >> Out of the bazillions of possible ways to
> configure
> >> matter only a
> >> ridiculously tiny fraction are more intelligent
> than
> >> a cockroach. Yet
Colin Tate-Majcher wrote:
When you talk about "uploading" are you referring to creating a copy
of your consciousness? If that's the case then what do you do after
uploading, continue on with a mediocre existence while your
cyber-duplicate shoots past you? Sure, it would have all of those
won
Tom McCabe wrote:
--- Samantha Atkins <[EMAIL PROTECTED]> wrote:
Out of the bazillions of possible ways to configure
matter only a
ridiculously tiny fraction are more intelligent than
a cockroach. Yet
it did not take any grand design effort upfront to
arrive at a world
overrun when
On Jun 29, 2007, at 6:54 PM, Matt Mahoney wrote:
--- Randall Randall <[EMAIL PROTECTED]> wrote:
On Jun 28, 2007, at 7:51 PM, Matt Mahoney wrote:
--- Stathis Papaioannou <[EMAIL PROTECTED]> wrote:
How does this answer questions like, if I am destructively
teleported
to two different location
--- Randall Randall <[EMAIL PROTECTED]> wrote:
>
> On Jun 28, 2007, at 7:51 PM, Matt Mahoney wrote:
> > --- Stathis Papaioannou <[EMAIL PROTECTED]> wrote:
> >> How does this answer questions like, if I am destructively teleported
> >> to two different locations, what can I expect to experience?
Stathis:
> Although you make an exception when the copying takes place gradually
> inside your own head, switching atoms in your brain for new ones
> obtained from environmental raw materials, and excreting the original
> atoms.
There is no exception because the two cases are not equivalent whe
On 29/06/07, Heartland <[EMAIL PROTECTED]> wrote:
Stathis:
> Although you make an exception when the copying takes place gradually
> inside your own head, switching atoms in your brain for new ones
> obtained from environmental raw materials, and excreting the original
> atoms.
There is no exce
On 29/06/07, Heartland <[EMAIL PROTECTED]> wrote:
The contradiction exists only in the minds of those who can't see or are unable
to accept that "consciousness" doesn't transfer to a copy regardless of anything
else. Once this is clear, the imaginary paradox disappears. This paradox has
alw
On 29/06/07, Heartland <[EMAIL PROTECTED]> wrote:
The contradiction exists only in the minds of those who can't see or are unable
to accept that "consciousness" doesn't transfer to a copy regardless of anything
else. Once this is clear, the imaginary paradox disappears. This paradox has
alway
I'm going to let the zombie thread die.
- Tom
--- Stathis Papaioannou <[EMAIL PROTECTED]> wrote:
> On 29/06/07, Tom McCabe <[EMAIL PROTECTED]>
> wrote:
>
> > But when you talk about "yourself", you mean the
> > "yourself" of the copy, not the "yourself" of the
> > original person. While all th
On 29/06/07, Tom McCabe <[EMAIL PROTECTED]> wrote:
But when you talk about "yourself", you mean the
"yourself" of the copy, not the "yourself" of the
original person. While all the copied selves can only
exist in one body, the original self can exist in more
than one body. You can pull this off
--- Randall Randall <[EMAIL PROTECTED]>
wrote:
>
> On Jun 28, 2007, at 11:26 PM, Tom McCabe wrote:
> > --- Randall Randall <[EMAIL PROTECTED]>
> > wrote:
> >> and
> >> What should a person before a copying experiment
> >> expect to remember, after the experiment? That
> is,
> >> what should he
On Jun 28, 2007, at 11:26 PM, Tom McCabe wrote:
--- Randall Randall <[EMAIL PROTECTED]>
wrote:
and
What should a person before a copying experiment
expect to remember, after the experiment? That is,
what should he anticipate?
Waking up as a copy, as this will be true for all the
copies which
--- Randall Randall <[EMAIL PROTECTED]>
wrote:
> On Jun 28, 2007, at 9:08 PM, Tom McCabe wrote:
> > --- Randall Randall <[EMAIL PROTECTED]>
> wrote:
> >> On Jun 28, 2007, at 7:35 PM, Tom McCabe wrote:
> >>> You're assuming again that consciousness is
> conserved.
> >> I have no idea why you think
On Jun 28, 2007, at 9:08 PM, Tom McCabe wrote:
--- Randall Randall <[EMAIL PROTECTED]> wrote:
On Jun 28, 2007, at 7:35 PM, Tom McCabe wrote:
You're assuming again that consciousness is conserved.
I have no idea why you think so. I would say that
I think that each copy is conscious only of the
On 28/06/07, Matt Mahoney <[EMAIL PROTECTED]> wrote:
When logic conflicts with instinct, instinct wins and the logic gets
contorted. The heated discussion on the copy paradox is a perfect example.
Your consciousness is transferred to the copy only if the original is
destroyed, or destroyed in
--- Stathis Papaioannou <[EMAIL PROTECTED]> wrote:
> On 29/06/07, Tom McCabe <[EMAIL PROTECTED]>
> wrote:
>
> > I think
> > it works better to look at it from the perspective
> of
> > the guy doing the upload rather than the guy being
> > uploaded. If you magically inserted yourself into
> the
>
--- Stathis Papaioannou <[EMAIL PROTECTED]> wrote:
> On 29/06/07, Niels-Jeroen Vandamme
>
> > Personally, I do not believe in coincidence.
> Everything in the universe
> > might seem stochastic, but it all has a logical
> explanation. I believe the
> > same applies to quantum chaos, though quant
--- Stathis Papaioannou <[EMAIL PROTECTED]> wrote:
> On 29/06/07, Charles D Hixson
> <[EMAIL PROTECTED]> wrote:
>
> > > Yes, you would live on in one of the copies as
> if uploaded, and yes
> > > the selection of which copy would be purely
> random, dependent on the
> > > relative frequency of e
On 29/06/07, Tom McCabe <[EMAIL PROTECTED]> wrote:
I think
it works better to look at it from the perspective of
the guy doing the upload rather than the guy being
uploaded. If you magically inserted yourself into the
brain of a copy at random, then you're right- you'd
have an equal chance of wa
--- Randall Randall <[EMAIL PROTECTED]>
wrote:
> On Jun 28, 2007, at 7:35 PM, Tom McCabe wrote:
> > You're assuming again that consciousness is
> conserved.
>
> I have no idea why you think so. I would say that
> I think that each copy is conscious only of their
> own particular existence, and
On 29/06/07, Niels-Jeroen Vandamme
Personally, I do not believe in coincidence. Everything in the universe
might seem stochastic, but it all has a logical explanation. I believe the
same applies to quantum chaos, though quantum mechanics is still far too
recondite for us to understand this pheno
On 29/06/07, Charles D Hixson <[EMAIL PROTECTED]> wrote:
> Yes, you would live on in one of the copies as if uploaded, and yes
> the selection of which copy would be purely random, dependent on the
> relative frequency of each copy (you can still define a measure to
> derive probabilities even t
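Stathis's "measure" here can be read as nothing more exotic than a weighted
distribution over successor observer-moments. A minimal sketch, assuming
invented copy counts (the weights below are illustrative, not anything from
the thread):

    import random

    def subjective_next_moment(copies):
        """Pick a successor with probability proportional to each
        copy's relative frequency -- its 'measure'."""
        labels = list(copies)
        weights = [copies[label] for label in labels]
        return random.choices(labels, weights=weights, k=1)[0]

    # Suppose the experiment runs three uploads alongside one biological
    # original: under the frequency measure you would anticipate waking
    # as an upload three times out of four.
    copies = {"original": 1, "upload": 3}
    samples = [subjective_next_moment(copies) for _ in range(10_000)]
    print(samples.count("upload") / len(samples))  # ~0.75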
On Jun 28, 2007, at 7:51 PM, Matt Mahoney wrote:
--- Stathis Papaioannou <[EMAIL PROTECTED]> wrote:
How does this answer questions like, if I am destructively teleported
to two different locations, what can I expect to experience? That's
what I want to know before I press the button.
You have
On Jun 28, 2007, at 7:35 PM, Tom McCabe wrote:
You're assuming again that consciousness is conserved.
I have no idea why you think so. I would say that
I think that each copy is conscious only of their
own particular existence, and if that's what you
mean by "consciousness is conserved", I sup
--- Stathis Papaioannou <[EMAIL PROTECTED]> wrote:
> On 28/06/07, Matt Mahoney <[EMAIL PROTECTED]> wrote:
> > So how do we approach the question of uploading without leading to a
> > contradiction? I suggest we approach it in the context of outside
> observers
> > simulating competing agents. Ho
--- Niels-Jeroen Vandamme <[EMAIL PROTECTED]> wrote:
> >A thermostat perceives the temperature and acts on it. Is it conscious?
>
> Registering does not equal perceiving. I mean subjective experience.
That's a subjective view of perception. If an entity says "I feel cold", is
it conscious? Or
You're assuming again that consciousness is conserved.
If you go into the machine full of magical sticky
consciousness stuff, and that stuff can only be
present in one body at any given time, then sure, when
you wake up, the probability of you waking up in box A
is 50% and the probability of waking
On Jun 28, 2007, at 5:59 PM, Tom McCabe wrote:
--- Randall Randall <[EMAIL PROTECTED]>
wrote:
On Jun 28, 2007, at 5:18 PM, Tom McCabe wrote:
How do you get the "50% chance"? There is a 100%
chance of a mind waking up who has been uploaded,
and
also a 100% chance of a mind waking up who hasn'
--- Randall Randall <[EMAIL PROTECTED]>
wrote:
>
> On Jun 28, 2007, at 5:18 PM, Tom McCabe wrote:
> > How do you get the "50% chance"? There is a 100%
> > chance of a mind waking up who has been uploaded,
> and
> > also a 100% chance of a mind waking up who hasn't.
> > This doesn't violate the l
--- Niels-Jeroen Vandamme
<[EMAIL PROTECTED]> wrote:
> >This is a textbook case of what Eliezer calls
> >"worshipping a sacred mystery". People tend to act
> >like a theoretical problem is some kind of God,
> >something above them in the social order, and since
> >it's beaten others before you it
On Jun 28, 2007, at 5:18 PM, Tom McCabe wrote:
How do you get the "50% chance"? There is a 100%
chance of a mind waking up who has been uploaded, and
also a 100% chance of a mind waking up who hasn't.
This doesn't violate the laws of probability because
these aren't mutually exclusive. Asking wh
This is a textbook case of what Eliezer calls
"worshipping a sacred mystery". People tend to act
like a theoretical problem is some kind of God,
something above them in the social order, and since
it's beaten others before you it would be wise to pay
due deference to it. Of course, a theoretical p
How do you get the "50% chance"? There is a 100%
chance of a mind waking up who has been uploaded, and
also a 100% chance of a mind waking up who hasn't.
This doesn't violate the laws of probability because
these aren't mutually exclusive. Asking which one "was
you" is silly, because we're assuming
--- Niels-Jeroen Vandamme
<[EMAIL PROTECTED]> wrote:
> >From: Charles D Hixson <[EMAIL PROTECTED]>
> >Reply-To: singularity@v2.listbox.com
> >To: singularity@v2.listbox.com
> >Subject: Re: [singularity] critiques of Eliezer's
> views on AI
> >Da
On Jun 28, 2007, at 12:56 PM, Charles D Hixson wrote:
Stathis Papaioannou wrote:
Yes, you would live on in one of the copies as if uploaded, and yes
the selection of which copy would be purely random, dependent on the
relative frequency of each copy (you can still define a measure to
derive pr
From: Charles D Hixson <[EMAIL PROTECTED]>
Reply-To: singularity@v2.listbox.com
To: singularity@v2.listbox.com
Subject: Re: [singularity] critiques of Eliezer's views on AI
Date: Thu, 28 Jun 2007 09:56:12 -0700
Stathis Papaioannou wrote:
On 28/06/07, Niels-Jeroen Vandamme <[E
Stathis Papaioannou wrote:
On 28/06/07, Niels-Jeroen Vandamme <[EMAIL PROTECTED]>
wrote:
An interesting thought experiment: if the universe is infinite,
according to
a ballpark estimate there would be an exact copy of you at a distance of
10^(10^29) m: because of the Bekenstein bound of the i
On 28/06/07, Matt Mahoney <[EMAIL PROTECTED]> wrote:
When logic conflicts with instinct, instinct wins and the logic gets
contorted. The heated discussion on the copy paradox is a perfect example.
Your consciousness is transferred to the copy only if the original is
destroyed, or destroyed in ce
On 28/06/07, Niels-Jeroen Vandamme <[EMAIL PROTECTED]> wrote:
An interesting thought experiment: if the universe is infinite, according to
a ballpark estimate there would be an exact copy of you at a distance of
10^(10^29) m: because of the Bekenstein bound of the information of matter,
there ar
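The 10^(10^29) m figure can be reproduced, to within its enormous error bars,
by a back-of-envelope calculation in the spirit of Tegmark's estimate. Every
number below is an assumption, and the arithmetic is done in log10 since the
quantities overflow floats:

    import math

    particles_in_a_person = 1e29          # rough count of degrees of freedom
    log10_states = particles_in_a_person * math.log10(2)  # ~2^N configurations

    # Scatter those states at random through infinite space: the expected
    # distance to the nearest exact duplicate scales like states^(1/3); the
    # metre-scale box size is negligible against the double exponent.
    log10_distance_m = log10_states / 3
    print(f"nearest copy ~ 10^(10^{math.log10(log10_distance_m):.1f}) m")
    # -> 10^(10^28.0) m; at double-exponential scale this is the same
    #    ballpark as the 10^(10^29) m quoted above.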
A thermostat perceives the temperature and acts on it. Is it conscious?
Registering does not equal perceiving. I mean subjective experience.
We think we know what consciousness is.
Actually, I'm quite aware that I don't. I find consciousness to be the
greatest puzzle in the universe, but i
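For what it's worth, the thermostat under discussion is completely specified
in a few lines, which is rather the point of the deflationary reading. A toy
version (illustrative only):

    class Thermostat:
        def __init__(self, setpoint_c: float):
            self.setpoint_c = setpoint_c

        def step(self, measured_c: float) -> str:
            # "Perceive" in the functional sense: register a value, act on it.
            return "heat_on" if measured_c < self.setpoint_c else "heat_off"

    print(Thermostat(20.0).step(17.5))  # heat_on -- and nobody home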
<[EMAIL PROTECTED]>
> >Reply-To: singularity@v2.listbox.com
> >To: singularity@v2.listbox.com
> >Subject: Re: [singularity] critiques of Eliezer's views on AI
> >Date: Mon, 25 Jun 2007 17:19:20 -0700 (PDT)
> >
> >
> >--- Jey Kottalam <[E
On Wed, Jun 27, 2007 at 12:52:18PM -0500, Papiewski, John wrote:
> What I meant was, the only method a sane, informed, non-suicidal person
> would want to sign up for of their own free will.
I like to think that most cryonicists are sane, informed, and non-suicidal
when they sign their contracts.
What I meant was, the only method a sane, informed, non-suicidal person
would want to sign up for of their own free will.
Of course there are several physical methods that might upload your
personality, indistinguishable from you, but would kill YOU. So there's
no benefit to you, though nobody el
On 26/06/07, Eugen Leitl <[EMAIL PROTECTED]> wrote:
> If you don't destroy the original, then subjectively it would be like
> a transporter that only works half the time. The only frightening
Why? Are you assuming the first copy (original) remains stationary,
and the second gets transposed? If
On Tue, Jun 26, 2007 at 07:14:25PM +, Niels-Jeroen Vandamme wrote:
> Until the planet is overcrowded with your cyberclones.
Planet, shmanet. There's GLYrs of real estate right up there.
--
Eugen* Leitl <leitl> http://leitl.org
__
Until the planet is overcrowded with your cyberclones.
From: "Nathan Cook" <[EMAIL PROTECTED]>
Reply-To: singularity@v2.listbox.com
To: singularity@v2.listbox.com
Subject: Re: [singularity] critiques of Eliezer's views on AI (was: Re:
Personal attacks)
Date: Tue, 26 Ju
On Tue, Jun 26, 2007 at 10:14:04AM -0700, Tom McCabe wrote:
> > How about 20-30 sec of stopped blood flow. Instant
> > flat EEG. Or, hypothermia. Or, anaesthesia (barbies
> > are nice)
>
> This is human life, remember, so we had better be darn
> sure that all neuronal activity whatsoever has
> sto
--- Eugen Leitl <[EMAIL PROTECTED]> wrote:
> On Mon, Jun 25, 2007 at 11:53:09PM -0700, Tom McCabe
> wrote:
>
> > Not so much "anesthetic" as "liquid helium", I
> think,
>
> How about 20-30 sec of stopped blood flow. Instant
> flat EEG. Or, hypothermia. Or, anaesthesia (barbies
> are nice)
This
On 25/06/07, Matt Mahoney <[EMAIL PROTECTED]> wrote:
--- Nathan Cook <[EMAIL PROTECTED]> wrote:
> I don't wish to retread old arguments, but there are a few theoretical
outs.
> One could be uploaded bit by bit, one neuron at a time if necessary. One
> could be rendered unconscious, frozen, and s
On Tue, Jun 26, 2007 at 10:39:19PM +1000, Stathis Papaioannou wrote:
> If you don't destroy the original, then subjectively it would be like
> a transporter that only works half the time. The only frightening
Why? Are you assuming the first copy (original) remains stationary,
and the second gets
On 26/06/07, Matt Mahoney <[EMAIL PROTECTED]> wrote:
What is wrong with this logic?
Captain Kirk willingly steps into the transporter to have his atoms turned
into energy because he knows an identical copy will be reassembled on the
surface of the planet below. Would he be so willing if the ori
On Mon, Jun 25, 2007 at 10:30:15PM -0400, Colin Tate-Majcher wrote:
>I doubt that this was the intention of Gene Roddenberry's
>interpretation of teleporting.
I have nothing but contempt for the fantasy physics of Star Drek
which has ruined entire generations, but
http://memory-alpha.org
On Mon, Jun 25, 2007 at 11:53:09PM -0700, Tom McCabe wrote:
> Not so much "anesthetic" as "liquid helium", I think,
How about 20-30 sec of stopped blood flow. Instant
flat EEG. Or, hypothermia. Or, anaesthesia (barbies
are nice)
> to be quadruply sure that all brain activity has
> stopped and th
On Mon, Jun 25, 2007 at 08:08:17PM -0400, Jey Kottalam wrote:
> On 6/25/07, Matt Mahoney <[EMAIL PROTECTED]> wrote:
> >
> >You can only transfer
> >consciousness if you kill the original.
>
> What is the justification for this claim?
Because the copies diverge, unless subject to
synchronization b
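Eugen's divergence claim is easy to see in a toy model: two initially
identical states fed independent inputs stop being interchangeable almost
immediately. A sketch (toy dynamics, not neuroscience; the hash stands in
for any input-dependent state update):

    import random

    def run(state: int, seed: int, steps: int = 10) -> int:
        rng = random.Random(seed)   # each copy receives its own inputs
        for _ in range(steps):
            state = hash((state, rng.random()))
        return state

    original, copy = run(42, seed=1), run(42, seed=2)
    print(original == copy)  # False: absent synchronization, they diverge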
On 5/31/07, Jey Kottalam <[EMAIL PROTECTED]> wrote:
on Google, but this returned 1,350 results. Are there any other
critiques I should be aware of? The only other one that I know of are
Bill Hibbard's at http://www.ssec.wisc.edu/~billh/g/mi.html . I
personally have not found much that I disagree
On Mon, Jun 25, 2007 at 06:20:51PM -0400, Colin Tate-Majcher wrote:
>
>When you talk about "uploading" are you referring to creating a copy
>of your consciousness? If that's the case then what do you do after
You die. The process is destructive.
>uploading, continue on with a medioc
Ants I'm not sure about, but many species are still
here only because we, as humans, are not simple
optimization processes that turn everything they see
into paperclips. Even so, we regularly do the exact
same thing that people say AIs won't do- we bulldoze
into some area, set up developments, and
(sigh) That's not the point. What Gene Roddenberry
thought, and whether Star Trek is real or not, are
totally irrelevant to the ethical issue of whether
"transportation" would be a good thing, and how it
should be done to minimize any possible harmful
effects.
- Tom
--- Colin Tate-Majcher <[EMAI
You're confusing memetics and genetics here, I think.
We couldn't possibly have an evolutionary instinct to
"believe in consciousness" because A), there's no
selection pressure for it as hunter-gatherers don't
think much about philosophy, and B) there hasn't been
enough time for such an instinct to
Because otherwise it would be a copy and not a
transfer. "Transfer" implies that it is moved from one
place to another and so only one being can exist when
the process is finished.
- Tom
--- Jey Kottalam <[EMAIL PROTECTED]> wrote:
> On 6/25/07, Matt Mahoney <[EMAIL PROTECTED]>
> wrote:
> >
> >
Not so much "anesthetic" as "liquid helium", I think,
to be quadruply sure that all brain activity has
stopped and the physical self and virtual self don't
diverge. People do have brain activity even while
unconscious.
- Tom
--- Jey Kottalam <[EMAIL PROTECTED]> wrote:
> On 6/25/07, Papiewski, J
Without consciousness, there could be no perception. I am surely conscious
right now, and how I am conscious will remain a mystery for many years.
From: Matt Mahoney <[EMAIL PROTECTED]>
Reply-To: singularity@v2.listbox.com
To: singularity@v2.listbox.com
Subject: Re: [singularity] critiques of Eli
Alan Grimes wrote:
OTOH, let's consider a few scenarios where no super-human AI
develops. Instead there develops:
a) A cult of death that decides that humanity is a mistake, and decides
to solve the problem via genetically engineered plagues. (Well,
diseases. I don't specifically mean plague
Matt Mahoney wrote:
--- Tom McCabe <[EMAIL PROTECTED]> wrote:
These questions, although important, have little to do
with the feasibility of FAI.
These questions are important because AGI is coming, friendly or not. Will
our AGIs cooperate or compete? Do we upload ourselves?
...
-
> OTOH, let's consider a few scenarios where no super-human AI
> develops. Instead there develops:
> a) A cult of death that decides that humanity is a mistake, and decides
> to solve the problem via genetically engineered plagues. (Well,
> diseases. I don't specifically mean plague.)
http://
Kaj Sotala wrote:
On 6/22/07, Charles D Hixson <[EMAIL PROTECTED]> wrote:
Dividing things into us vs. them, and calling those that side with us
friendly seems to be instinctually human, but I don't think that it's a
universal. Even then, we are likely to ignore birds, ants that are
outside, and
Captain Kirk willingly steps into the transporter to have his atoms turned
into energy because he knows an identical copy will be reassembled on the
surface of the planet below. Would he be so willing if the original was left
behind?
I doubt that this was the intention of Gene Roddenberry's inte
On 6/22/07, Charles D Hixson <[EMAIL PROTECTED]> wrote:
And *my* best guess is that most super-humanly intelligent AIs will just
choose to go elsewhere, and leave us alone.
My personal opinion is Intelligence explosions, whether artificial or not,
lead to great diversity and varied personalit
--- Jey Kottalam <[EMAIL PROTECTED]> wrote:
> On 6/25/07, Matt Mahoney <[EMAIL PROTECTED]> wrote:
> >
> > You can only transfer
> > consciousness if you kill the original.
>
> What is the justification for this claim?
There is none, which is what I was trying to argue. Consciousness does not
a