--- Stathis Papaioannou <[EMAIL PROTECTED]> wrote:

> On 01/07/07, Tom McCabe <[EMAIL PROTECTED]> wrote:
> 
> > An excellent analogy to a superintelligent AGI is a
> > really good chess-playing computer program. The
> > computer program doesn't realize you're there, it
> > doesn't know you're human, it doesn't even know what
> > the heck a human is, and it would gladly pump you full
> > of gamma radiation if it made you a worse player.
> > Nevertheless, it is still intelligent, more so than
> > you are: it can foresee everything you try to do, and
> > can invent new strategies and use them to come out of
> > nowhere and beat you by surprise. Trying to deprive a
> > superintelligent AI of free will is as absurd as Garry
> > Kasparov trying to deny Deep Blue free will within the
> > context of the gameboard.
> 
> But Deep Blue wouldn't try to poison Kasparov in order
> to win the game. This isn't because it isn't intelligent
> enough to figure out that disabling your opponent would
> be helpful, it's because the problem it is applying its
> intelligence to is winning according to the formal rules
> of chess.

Exactly. The formal rules of chess say stuff about
where to put pawns and knights; they're analogous to
the laws of physics. They don't say anything about
poisoning the opposing player. If you try to build in
a rule about poisoning the player, the chess program
will shoot him; if you build in a rule against killing
him, the chess program will give him a hallucinogen;
if you build in a rule against giving him drugs, the
chess program will hijack the room wall and turn it
into a realistic 3D display of what would happen if a
truck smashed into the room by accident. This approach
will never work: you're pitting your intelligence at
designing rules against the program's intelligence at
evading them, and it's smarter than you are.
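Stathis's point about the formal rules can be made concrete with a toy
game player (a hypothetical sketch, not Deep Blue's actual code). The
agent's action space and value function are defined only over legal
game states, so "disable the opponent" isn't a move the rules forbid;
it's a move the agent's model of the world cannot even represent:

```python
# Toy game: players alternately add 1 or 2 to a counter; whoever
# reaches exactly 10 wins. The agent's entire world is this counter.

def legal_moves(state):
    # The only actions that exist are the formal game's moves.
    return [m for m in (1, 2) if state + m <= 10]

def minimax(state, player):
    """Return (value, best_move) for the player to move (+1 or -1).

    The value is from player +1's point of view. Every term here is a
    function of the board state alone; there is no hook at which a
    rule like "don't poison the opponent" could even attach.
    """
    if state == 10:
        return -player, None  # the player who just moved has won
    best_val, best_move = None, None
    for m in legal_moves(state):
        val, _ = minimax(state + m, -player)
        if best_val is None or val * player > best_val * player:
            best_val, best_move = val, m
    return best_val, best_move
```

Running `minimax(0, 1)` shows the first player wins by moving 1 and then
mirroring the opponent. The point of the sketch is structural: "winning
at any cost" and "winning within the rules" are different optimization
problems, because the second one is defined over a domain in which the
costs outside the game do not exist.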

 - Tom

> Winning at any cost might look like the same problem
> to us vague humans, but it isn't.
> 
> 
> 
> -- 
> Stathis Papaioannou
> 



       

-----
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?member_id=4007604&user_secret=7d7fb4d8
