You're confusing memetics and genetics here, I think. We couldn't possibly have an evolutionary instinct to "believe in consciousness" because (a) there's no selection pressure for it, as hunter-gatherers don't think much about philosophy, and (b) there hasn't been enough time for such an instinct to evolve.
Because otherwise it would be a copy and not a
transfer. "Transfer" implies that it is moved from one
place to another and so only one being can exist when
the process is finished.
- Tom
--- Jey Kottalam <[EMAIL PROTECTED]> wrote:
> On 6/25/07, Matt Mahoney <[EMAIL PROTECTED]> wrote:
Not so much "anesthetic" as "liquid helium", I think,
to be quadruply sure that all brain activity has
stopped and the physical self and virtual self don't
diverge. People do have brain activity even while
unconscious.
- Tom
--- Jey Kottalam <[EMAIL PROTECTED]> wrote:
> On 6/25/07, Papiewski, John <[EMAIL PROTECTED]> wrote:
Without consciousness, there could be no perception. I am surely conscious right now, and how I am conscious will remain a mystery for many years.
From: Matt Mahoney <[EMAIL PROTECTED]>
Reply-To: singularity@v2.listbox.com
To: singularity@v2.listbox.com
Subject: Re: [singularity] critiques of Eliezer's v
Alan Grimes wrote:
OTOH, let's consider a few scenarios where no super-human AI develops. Instead there develops:
a) A cult of death that decides that humanity is a mistake, and decides to solve the problem via genetically engineered plagues. (Well, diseases. I don't specifically mean plague.)
Matt Mahoney wrote:
--- Tom McCabe <[EMAIL PROTECTED]> wrote:
These questions, although important, have little to do
with the feasibility of FAI.
These questions are important because AGI is coming, friendly or not. Will
our AGIs cooperate or compete? Do we upload ourselves?
...
> OTOH, let's consider a few scenarios where no super-human AI develops. Instead there develops:
> a) A cult of death that decides that humanity is a mistake, and decides to solve the problem via genetically engineered plagues. (Well, diseases. I don't specifically mean plague.)
Kaj Sotala wrote:
On 6/22/07, Charles D Hixson <[EMAIL PROTECTED]> wrote:
Dividing things into us vs. them, and calling those that side with us
friendly seems to be instinctually human, but I don't think that it's a
universal. Even then, we are likely to ignore birds, ants that are
outside, and
Captain Kirk willingly steps into the transporter to have his atoms turned
into energy because he knows an identical copy will be reassembled on the
surface of the planet below. Would he be so willing if the original was
left
behind?
I doubt that this was the intention of Gene Roddenberry's inte
On 6/22/07, Charles D Hixson <[EMAIL PROTECTED]> wrote:
And *my* best guess is that most super-humanly intelligent AIs will just
choose to go elsewhere, and leave us alone.
My personal opinion is that intelligence explosions, whether artificial or not, lead to great diversity and varied personalit
--- Jey Kottalam <[EMAIL PROTECTED]> wrote:
> On 6/25/07, Matt Mahoney <[EMAIL PROTECTED]> wrote:
> >
> > You can only transfer
> > consciousness if you kill the original.
>
> What is the justification for this claim?
There is none, which is what I was trying to argue. Consciousness does not
a
On 6/25/07, Matt Mahoney <[EMAIL PROTECTED]> wrote:
You can only transfer
consciousness if you kill the original.
What is the justification for this claim?
-Jey Kottalam
What is wrong with this logic?
Captain Kirk willingly steps into the transporter to have his atoms turned
into energy because he knows an identical copy will be reassembled on the
surface of the planet below. Would he be so willing if the original was left
behind?
This is a case of logic conflic
Papiewski, John wrote:
> You’re not misunderstanding and it is horrible.
>
> The only way to do it is to gradually replace your brain cells with an
> artificial substitute.
>
> You’d be barely aware that something is going on, and there wouldn’t be
> two copies of you to be confused over.
Good
On 6/25/07, Papiewski, John <[EMAIL PROTECTED]> wrote:
The only way to do it is to gradually replace your brain cells with an
artificial substitute.
We could instead anesthetize the crap out of you, upload you, turn on
your upload, and then make Soylent Green out of your original.
-Jey Kottalam
You're not misunderstanding and it is horrible.
The only way to do it is to gradually replace your brain cells with an
artificial substitute.
You'd be barely aware that something is going on, and there wouldn't be
two copies of you to be confused over.
For example, imagine a medica
When you talk about "uploading", are you referring to creating a copy of your consciousness? If that's the case, then what do you do after uploading: continue on with a mediocre existence while your cyber-duplicate shoots past you? Sure, it would have all of those wonderful abilities you mention,
--- Nathan Cook <[EMAIL PROTECTED]> wrote:
> I don't wish to retread old arguments, but there are a few theoretical outs.
> One could be uploaded bit by bit, one neuron at a time if necessary. One
> could be rendered unconscious, frozen, and scanned. I would find this
> frightening, but preferable
On 24/06/07, Matt Mahoney <[EMAIL PROTECTED]> wrote:
Do we upload? Consider the copy paradox. If there were an exact copy of you, atom for atom, and you had to choose between killing the copy and killing yourself, I think you would choose to kill the copy (and the copy would choose to kill you). Does