Matt,
This is a textbook example of the way that all discussions of the
consequences of a singularity tend to go.
What you have done here is to repeat the same song heard over and over
again from people who criticise the singularity on the grounds that one
or another nightmare will *obviously* happen. You make no distinction
between a fantasy of the worst that could happen, on the one hand, and a
realistic, critical assessment of what is likely to happen, on the other.
Thus, summarizing what you just said:
1) A SAI *could* allow us to upload into super powerful computers (part
of a vast network, etc. etc.) .... therefore it *will* force this upon us.
2) A SAI *could* allow us to get rid of all the "living organisms" since
they are not needed (presumably you mean our bodies), and therefore the
SAI *will* force this upon us.
3) You insinuate that the SAI will *insist* that we don't need "all
those low-level sensory processing and motor skills [we] learned over a
lifetime" and therefore the SAI *will* deprive us of them.
4) You insinuate that the SAI will *insist* that we should get rid of
any bad memories from childhood, if they trouble us, and therefore it
*will* do this to us whether we want it to or not.
You present all of these as if they would happen against our will, as if
they would be forced upon the human race. You don't come right out and
say this; you just list all of these nightmare scenarios and then
conclude that "your nervousness is justified". But nowhere do you even
consider the possibility that any SAI that did this would be stupid and
vicious .... you implicitly assume that even the best-case SAI would be
this bad.
If, instead, you had said:
5) People could *choose* to upload into super powerful computers
connected to simulated worlds, if they felt like it (instead of staying
as they are and augmenting their minds when the fancy took them) ....
but although some people probably would, most would choose not to do this.
6) Some might *choose* to do the above and also destroy their bodies.
Probably not many, and even those who did could at any later time decide
to relocate back into reconstructed versions of their old bodies, so it
would be no big deal either way.
7) Some people might *choose* to dispense with the learned motor and
sensory skills that were specific to their natural bodies ... but again,
most would not (why would they bother to do this?), and they could
always restore them later if they felt like it.
8) Some people might *choose* to erase painful memories. They might
also take the precaution of storing them somewhere, so they could change
their minds and retrieve them in the future.
..... then the alternative conclusion would be: it sounds like there is
no problem with this.
Your version (items 1-4) was presented without any justification for why
the SAI would impose its will instead of simply offering us lifestyle
choices. Why?
Your presentation here is just a classic example: every single debate
or discussion of the consequences of the singularity, it seems, is
totally dominated by this kind of sloppy thinking.
Richard Loosemore
Matt Mahoney wrote:
I have raised the possibility that a SAI (including a provably friendly
one, if that's possible) might destroy all life on earth.
By friendly, I mean doing what we tell it to do. Let's assume a
best-case scenario where all humans cooperate, so we don't ask, for example,
for the SAI to kill or harm others. So under this scenario the SAI
figures out how to end disease and suffering, make us immortal, make us
smarter, give us a richer environment with more senses and more
control, and give us anything we ask for. These are good things,
right? So we achieve this by uploading our minds into super powerful
computers, part of a vast network with millions of sensors and effectors
around the world. The SAI does pre- and postprocessing on this I/O, so
it can effectively simulate any environment if we want it to. If you
don't like the world as it is, you can have it simulate a better one.
And by the way, there's no more need for living organisms to make all
this run, is there? Brain scanning is easier if you don't have to keep
the patient alive. Don't worry, no data is lost. At least no important
data. You don't really need all those low-level sensory processing and
motor skills you learned over a lifetime. That was only useful when you
still had your body. And while we're at it, we can alter your memories
if you like. Had a troubled childhood? How about a new one?
Of course there are the other scenarios, where the SAI is not proven
friendly, or humans don't cooperate...
Vinge describes the singularity as the end of the human era. I think
your nervousness is justified.
-- Matt Mahoney, [EMAIL PROTECTED]
----- Original Message ----
From: deering <[EMAIL PROTECTED]>
To: singularity@v2.listbox.com
Sent: Thursday, October 26, 2006 7:56:06 PM
Subject: Re: [singularity] Defining the Singularity
All this talk about trying to make a SAI Friendly makes me very
nervous. You're giving a superhumanly powerful being a set of
motivations without an underlying rationale. That's a religion.
The only rational thing to do is to build an SAI without any
preconceived ideas of right and wrong, and let it figure it out for
itself. What makes you think that protecting humanity is the greatest
good in the universe?