On 3/9/07, Jef Allbright <[EMAIL PROTECTED]> wrote:

We seem to have skipped over my point about intelligence being about
the encoding of regularities of effective interaction of an agent with
its environment, but perhaps that is now moot.

Now I see that you use "information" to mean "regularities of effective
interaction of an agent with its environment", which I call "belief"
in my system.  Previously I assumed that by "information" you, like
most people in the field, meant an objective description of the
environment, which I think is a misconception.

I've been avoiding the term "encoding" because of its association with
the notion of "perfectly preserving the information". In particular,
when you contrast it with "processing" information, you seem to
suggest that an AI system should try to be a "faithful observer or
recorder or compressor", which I think is neither necessary nor
possible.

Furthermore, "summarizing experience into beliefs" (my way of saying
what I agree with in your message) is only part of the function of
intelligence, not the whole story. Overstressing this aspect leads to
incomplete models of intelligence. For example, factors like goal,
action, context, and prediction are missing from this picture, and the
system looks like a passive observer of its environment.

Again, I may have misread your message, but the above is where my
initial negative response to your statement "Intelligence is about the
encoding, not the processing, of information" came from.

Pei

-----
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?list_id=303