I don't know if an AGI-level entity even needs to know that it exists. It
could know a great deal about everything except itself. Problems arise when
it starts being concerned with its own survival. Self-preservation is an
evolutionary drive: animals need to keep themselves alive in order to
reproduce. We could tell the AGI, "Don't worry about that stuff, it's all
taken care of; you just do what we ask of you."

 

John

 

From: Benjamin Goertzel [mailto:[EMAIL PROTECTED] 

Well, one problem is that the current mathematical definition of general
intelligence
is exactly that -- a definition of totally general intelligence, which is
unachievable
by any finite-resources AGI system...

On the other hand, IQ tests and such measure domain-specific capabilities
as much as general learning ability....  So human-oriented IQ tests are not
so important here.

I tend to think there should be some kind of test for general intelligence
that is based 
on the requirement for self-understanding....  Humans have fairly rich
dynamic internal models
of themselves, cockroaches don't, and dogs have only pretty lame ones...

Perhaps there could be a test that tries to measure the ability of a system
to predict its 
own reaction to various novel situations?   This would require the system to
be able to
model itself internally...
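
To make that a bit more concrete, here is a rough Python sketch of how such
a self-prediction test might be scored. All of the names -- Agent,
predict_own_response, respond, the similarity function -- are hypothetical,
just to illustrate the idea; no existing system or framework is assumed:

# Hypothetical self-prediction test harness; every name here is made up
# for illustration, not taken from any real AGI framework.
from typing import Callable, Iterable

class Agent:
    """Minimal interface an AGI system would need to expose for the test."""

    def predict_own_response(self, situation: str) -> str:
        """The system's prediction of what it will do in this situation."""
        raise NotImplementedError

    def respond(self, situation: str) -> str:
        """What the system actually does when the situation is presented."""
        raise NotImplementedError

def self_prediction_score(agent: Agent,
                          novel_situations: Iterable[str],
                          similarity: Callable[[str, str], float]) -> float:
    """Average agreement (0..1) between predicted and actual responses,
    taken over situations the system has not previously encountered."""
    scores = [similarity(agent.predict_own_response(s), agent.respond(s))
              for s in novel_situations]
    return sum(scores) / len(scores)

A concrete system would plug its own response machinery into respond() and
query its internal self-model for predict_own_response(); anything scoring
high would need a reasonably rich self-model to generate those predictions.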

However, it's still hard to make this kind of test objective in any sense,
as different AGI 
systems will be adapted to different kinds of environments...

Still, this is an interesting measure by which we could compare the
self-understanding of
different systems that live in the same environments... 

But, still, this is not really an objective measure of intelligence ...
just another sorta-interesting test...


