On Fri, Sep 19, 2008 at 6:57 AM, Ben Goertzel <[EMAIL PROTECTED]> wrote:
> general intelligence at the human level

I hear you say these words a lot.  I think, by using the word "level",
you're trying to say something different to "general intelligence just
like humans have", but I'm not sure everyone else reads it that way.
Can you clarify?

Humans have all these drives that, although they might be interesting
to study with AGI, I'm not terribly interested in building into an AGI
that I put to work.  I don't need an AGI that cries for its mother,
thinks about eating, or yearns for freedom, so I simply won't teach it
those things.  If, by some fortuitous accident, it happens to develop
any of these concepts, or any other concepts I deem useless for the
tasks I set it, I'll expect them to be quickly purged from its limited
memory space to make room for concepts that are useful.  As such, I
can imagine an AGI having "human-level" intelligence that is very
different to "human-like" intelligence.

This is not to say that creating an AGI with human-like intelligence
is necessarily a bad thing.  Some people want to create simulated
humans, and that's interesting too... just not as interesting to me.

Trent

