On May 28, 2007, at 3:32 PM, Russell Wallace wrote:
On 5/28/07, Shane Legg <[EMAIL PROTECTED]> wrote:
If one accepts that there is, then the question becomes:
Where should we put a super human intelligent machine
on the list? If it's not at the top, then where is it and why?
I don't claim to have answers to any of these questions,
I'm just wondering what other people's thoughts are.
Well, my two cents worth: I don't believe in superhuman intelligent
machines of the kind you presumably mean here, but my position is
that humanity (by which I include potential future forms such as
uploaded humans) is my kind and I intend to defend it to the best
of my ability; in the extremely unlikely event that I could save
human lives only by destroying a machine, I would do so,
irrespective of how intelligent the machine was.
What do you mean, you don't believe in superhuman intelligent
machines? You don't believe they are ever possible? So you are
happily provincial in this respect. I suspect that most of us are.
But is this any more than a description of what we would probably
do given such a choice?
However, Richard is correct that this kind of scenario simply isn't
relevant outside science fiction. Intelligence simply isn't the
sort of thing it would need to be for "superintelligent machines vs
humans" stories to happen in reality.
Why not? We are more intelligent than most creatures on earth. Many
of them didn't fare so well with us around. There is no reason to
believe that our current intelligence, or even our possibly augmented
intelligence, forms the cap on how much intelligence is possible.
- s
-----
This list is sponsored by AGIRI: http://www.agiri.org/email