--- Alan Grimes <[EMAIL PROTECTED]> wrote:

> ;)
> 
> Seriously now, Why do people insist there is a
> necessary connection (as
> in A implies B) between the singularity and brain
> uploading?
> 
> Why is it that anyone who thinks "the singularity
> happens and most
> people remain humanoid" is automatically branded a
> luddite?

I think most people are likely to remain humanoid, at
least initially, and I've never been branded a
Luddite. A utility function really doesn't describe
humans very well; there's no a priori reason why it
should. A utility function, if given the chance, will
immediately set the universe to whichever state has
the highest utility. Humans don't seem to work like
that, and probably wouldn't, at least in the short
term, even if given the opportunity. The natural
behavior for humans is a continuous, human-time-scale
progression; e.g., you pick up a rock, lug it some
distance, and set it down, each step being continuous.
This is inherent in evolution, because that's what
life in the ancestral environment is like. You could
model a human mind as a continuous curve, or a
probability distribution over curves, in some
billion-dimensional space (including the time
dimension). So while a human may desire a hundred
perfect mates, a luxury house, millions of dollars,
and all that, the natural, comfortable state is to
have these things slowly accumulate one at a time
rather than being dropped from the sky. Ironically,
when the world is changing really fast, one of the
first responses is to *slow down* so you can figure
out what's going on and assess the situation. I do
*not* think that most people will want to be suddenly
thrust into new and unfamiliar situations, even
enticing ones, the minute the FAI goes online.
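
To make the contrast concrete, here is a toy sketch.
This is entirely my own illustration: the
two-dimensional "world", the step size, and the names
(utility, maximizer_step, human_step) are all made up,
standing in for the billion-dimensional case above.

import math

# Toy "world": a point in the plane; utility is the
# negative distance to some target state.
target = (10.0, 10.0)

def utility(state):
    return -math.dist(state, target)

def maximizer_step(state):
    # A pure utility maximizer, given the chance, sets
    # the world directly to the highest-utility state.
    return target

def human_step(state, max_step=0.5):
    # A human-like agent moves toward higher utility
    # only in small, continuous, human-time-scale
    # increments.
    dx = target[0] - state[0]
    dy = target[1] - state[1]
    dist = math.hypot(dx, dy)
    if dist <= max_step:
        return target
    return (state[0] + dx / dist * max_step,
            state[1] + dy / dist * max_step)

state = (0.0, 0.0)
print(utility(maximizer_step(state)))  # 0.0: max utility in one jump
for _ in range(5):
    state = human_step(state)
print(state, utility(state))  # ~(1.77, 1.77), utility still ~-11.6

Both agents prefer the same end state; they differ only
in whether the path there is a discontinuous jump or a
bounded-velocity curve.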

> In such a scenario, it is quite possible to consider
> ultratechnologies, vast life extension, space
> exploitation, body customization, intelligence
> enhancement, even amazingly radical forms of
> intelligence enhancement, without spending a minute
> quibbling over pattern identity theory.

If you want to build an FAI, you need to know how to
define a sentient creature, or at least know how to
let the FAI figure it out, because you have to protect
the rights of uploads.

> I want to find people who are as enthusiastic as I
> am about such a future, people who are willing to
> spend hours each day trying to learn the skills
> required to develop a superhuman AI

If you want to develop all the skills necessary to
build an FAI, go ahead, there's nothing stopping you.
Wanting everyone *else* to do it is as patently
unrealistic as expecting everyone to get a PhD.

> instead of wasting all
> that mental energy on the question of how many
> angels can dance on the
> head of a pin.
> 
> looks like I'm going to have to resort to my
> collection of reductio ad absurdum arguments to get
> this list back on topic... =\
> 
> -- 
> Opera: Sing it loud! :o(  )>-<

Weren't you just complaining about discussing
irrelevant topics? :)

 - Tom
