On Thu, Jan 09, 2003 at 11:24:14AM -0500, Ben Goertzel wrote:

> I think the issues that are problematic have to do with the emotional
> baggage that humans attach to the self/other distinction.  Which an AGI will
> most likely *not* have, due to its lack of human evolutionary wiring...

Simplistically, humans evolved from an amoeba.  No emotions as such, but
certainly behaviors designed for consumption, growth, reproduction, and world
domination.  We've gotten so complicated our behavior systems haven't totally
kept up, so we end up with things like Italy having negative population
growth, but generally we can be seen as colonies of colonies of amoebae.

One evolutionary (in the loose sense, not genetic algorithms) route to AI is
the command shell: a program that waits around for a human request, then
hares off to fulfill it, then waits.
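That wait/fulfill/wait loop can be sketched in a few lines of Python. This is only an illustration of the idea, not any real system; the handler names and the tiny command table are invented for the example.

```python
def handle_request(command: str) -> str:
    """Dispatch a single human request.  A real system would grow this
    table -- concepts, world knowledge, suggestions -- over time."""
    handlers = {
        "greet": lambda: "Hello!",
        "add": lambda: str(2 + 2),  # placeholder capability
    }
    action = handlers.get(command)
    return action() if action else f"unknown command: {command}"

def shell(requests):
    """Wait for each request in turn, fulfill it, then go back to waiting.
    The iterable stands in for blocking on actual user input."""
    for command in requests:
        yield handle_request(command)
```

The point of the architecture is that the program only ever acts in response to a request; there is no self-motivated goal loop running between requests.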

"No guarantees", but it seems plausible to me, just imaginging a series of
enhancements -- concepts, more concepts, knowledge of the world, some
curiosity so as to suggest things -- that we could wend our way up to
intelligence while never accidentally coming close to an aggressive
self-motivated system.

-xx- Damien X-) 

