A difficulty that I think few people have ever addressed in a general
context is what I would term 'generative power'. This is in contrast
to learning ability: It is technically quite easy to create a system
that can learn anything you like, so long as you know exactly what
it's supposed to learn! No, the real problem, in my view, is how to
solve complex problems that have many seemingly plausible solutions.

Let's take the problem of finding out someone's opinion on a delicate
subject. A sufficiently knowledgeable AI might have all the data on
psychology and language to recognize a suitable question reasonably
quickly, yet be totally unable to compose one itself because of the
combinatorial complexity. This happens very easily in a GOFAI system,
where you end up with rigid heuristics about which other rigid
heuristics to use in which contexts, or even more layers of
indirection. The trouble is that such heuristics are practically
impossible for the system to acquire on its own; they have to be
taught. Added to that, there are always edge cases which bring the
whole house of cards tumbling down. Probabilistic approaches would
probably work a great deal better: they won't often find the absolute
best solution, but they'll usually come close. That seems to be how we
do it, at any rate, and it explains a great deal of the creativity we
see throughout the higher orders of life.
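
To make that concrete, here's a toy sketch in Python of what I mean by
a probabilistic approach (everything in it -- the templates, the scoring
function -- is made up purely for illustration): instead of chaining
heuristics to construct the one right question, you sample lots of
candidates and keep the best-scoring one you've seen.

  import random

  # Completely made-up scoring function, standing in for the AI's model
  # of what a tactful, informative question looks like.
  def score(question):
      return question.count("feel") * 5 - abs(len(question) - 45)

  # Tiny hand-written templates; a real system would have vastly more.
  openers  = ["How do you feel about", "What's your honest take on",
              "Would you be comfortable telling me how you see"]
  subjects = ["the new policy", "your colleague's decision",
              "the recent changes"]
  closers  = ["?", ", if you don't mind my asking?"]

  def random_question():
      return (random.choice(openers) + " " + random.choice(subjects)
              + random.choice(closers))

  # Sample a few hundred candidates and keep the best one seen.
  best = max((random_question() for _ in range(300)), key=score)
  print(best)

A few hundred samples won't guarantee the single best question, but the
best of the batch is nearly always good enough, and there are no
brittle meta-heuristics to maintain.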

Another point, one that seems almost trivially obvious, is embodiment.
If the AI has no sense of its physical location in space it will be at
a massive disadvantage. If, on the other hand, it has a constant stream
of data from its location, it will rapidly acquire the fundamental
grounding for the most basic concepts common to all mammalian life (if
it's any good!).
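
As a rough illustration of what I mean by grounding (again, just a toy
I cooked up, not a claim about how any real system does it): feed a
stream of noisy position readings into the simplest possible online
clustering and a handful of stable "places" falls out, which later
concepts could attach to.

  import math
  import random

  centroids = []   # the "places" the agent has discovered so far
  RADIUS = 1.0     # readings this close count as an already-known place

  def observe(pos):
      for c in centroids:
          if math.dist(pos, c) < RADIUS:
              # nudge the remembered place toward the new reading
              c[0] += 0.1 * (pos[0] - c[0])
              c[1] += 0.1 * (pos[1] - c[1])
              return
      centroids.append(list(pos))   # unlike any place seen before

  # Simulate wandering between three rooms with noisy location data.
  rooms = [(0.0, 0.0), (5.0, 5.0), (10.0, 0.0)]
  for _ in range(1000):
      x, y = random.choice(rooms)
      observe((x + random.gauss(0, 0.2), y + random.gauss(0, 0.2)))

  print(len(centroids), "places discovered")   # should report 3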

--Nate
