I tend to agree with Damien.

I see no intrinsic reason why a service-driven AGI system could not become
as intelligent as humans and then more intelligent.

Suppose an AGI is given an initial motivational structure that rewards it
for

* serving people effectively
* discovering and creating new patterns

Why are these not challenges enough to spur the evolution of ever-increasing
intelligence within the AGI system?
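
To make this concrete, here's a toy sketch in Python of what such a
two-part reward structure might look like.  This is a hypothetical
illustration only -- it is not Novamente's actual goal system, and the
signal names and weights are made up for the example:

    # Toy sketch of a two-part motivational structure.  Hypothetical
    # illustration only -- not Novamente's actual goal system.  The
    # agent's reward is a weighted sum of two signals: how effectively
    # it served people, and how much new pattern it discovered/created.

    def reward(service_effectiveness, pattern_novelty,
               w_service=0.7, w_novelty=0.3):
        # Both inputs are assumed normalized to [0, 1], e.g. user-rated
        # task success and some measure of pattern discovery on new data.
        return (w_service * service_effectiveness
                + w_novelty * pattern_novelty)

    # Service dominates: a very useful but unoriginal action still
    # scores well, while novelty alone does not.
    print(reward(0.9, 0.1))   # 0.66
    print(reward(0.1, 0.9))   # 0.34

The point is just that maximizing a signal like this is an open-ended
problem: serving people well and finding new patterns both get
arbitrarily hard, so there's no obvious ceiling built into the
motivator itself.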

Human intelligence arose largely out of the need to compete for survival,
but AGIs need not have the same motivators.

Survival is a suitably complex goal to drive the emergence of intelligence,
but so, perhaps, is service.

This service-oriented motivational structure is very close to what
Novamente's initial motivational structure will be.  Heck, the Novamente
software system right now -- 25% complete and non-generally-intelligent as
it is -- is *already* being used to serve people.  (Biomind LLC's alpha
Biomind Toolkit product, based on parts of the incomplete Novamente system,
will be done in March... its goal is to serve biologists with high-quality
data analyses based on a broad, integrative view of biological data... see
www.biomind.com).

- Ben Goertzel



> -----Original Message-----
> From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED]] On
> Behalf Of Damien Sullivan
> Sent: Thursday, January 09, 2003 10:27 PM
> To: [EMAIL PROTECTED]
> Subject: Re: [agi] Friendliness toward humans
>
>
> On Thu, Jan 09, 2003 at 10:23:07PM -0800, Alan Grimes wrote:
>
> > You _MIGHT_ be able to produce a proof of concept that way... However, a
> > practical working AI, such as the one which could help me design my
> > next body, would need to be quite a bit more. =\
>
> Why?  Why should such a thing require replacing the original
> service-driven
> motivation with something riskier?
>
> -xx- Damien X-)
>

