--- Stathis Papaioannou <[EMAIL PROTECTED]> wrote:

> On 02/07/07, Tom McCabe <[EMAIL PROTECTED]> wrote:
> 
> > The AGI doesn't care what any human, human committee, or human
> > government thinks; it simply follows its own internal rules.
> 
> Sure, but its internal rules and goals might be specified in such a
> way as to make it refrain from acting in a particular way. For
> example, if it has as its most important goal obeying the commands
> of humans, that's what it will do.

Yup. For example, if a human said "I want a banana", the fastest
way for the AGI to get the human a banana may be to detonate a
kilogram of RDX, launching the banana into the human at Mach 7.
This is clearly not going to work.

> It won't try to find some way out of it, because that assumes it
> has some other goal which trumps obeying humans.

The AGI will follow its goals; the problem isn't that
it will seek some way to avoid following its goals,
but that its goals do not match up exactly with the
enormously complicated range of human desires.

> If it is forced to randomly change its goals at regular intervals
> then it might become disobedient, but not otherwise.

How would you force a superintelligent AGI to change
its goals?


 - Tom


       
