Ben:  Are you interested in translating LRRH into Novamente's KR, as a
demo?



Not really...

Here's the thing: Novamente's KR is very flexible...

So, one could translate LRRH into Novamente-ese in a way that would sorta
resemble "Cyc plus probabilities, with some higher-order functions and
pattern-intensities too"
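
(Just to make that concrete: here is a toy sketch, in Python, of what a
"Cyc plus probabilities" style encoding of a couple of LRRH facts might
look like.  This is purely illustrative; it is not Novamente's actual
notation, and the predicates and truth-value fields are invented for the
example.)

from dataclasses import dataclass

@dataclass
class Assertion:
    predicate: str        # crisp, human-chosen predicate name
    args: tuple           # argument terms
    strength: float       # probability-like truth value
    confidence: float     # weight of evidence behind that strength

# A handful of crisp, probabilistically weighted LRRH assertions
kb = [
    Assertion("Deceives", ("Wolf", "LittleRedRidingHood"), 0.90, 0.8),
    Assertion("Swallows", ("Wolf", "Grandmother"), 0.95, 0.9),
    Assertion("LocatedIn", ("Grandmother", "Cottage"), 0.99, 0.9),
]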

But that likely wouldn't closely resemble the way LRRH would wind up
being represented in the mind of a Novamente instance that really understood
the story.

So the exercise of explicitly writing LRRH in Novamente's KR would likely
wind up being not only pointless, but actively misleading ;-)

While I do think that a probabilistic-logic-based KR (as NM uses) is a good
choice, I don't think the compact logical representation a human would use
to explicitly represent a story like LRRH is really the right kind of
representation for deep internal use by an AGI system.  An AGI's internal
representation of a story like this may be logical in form, but it is going
to consist of a very large number of uncertain, contextual relationships,
along with some of the crisper and more encapsulated ones like those a human
would formulate when carrying out the exercise of encoding LRRH in logic.
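
(Again just a toy sketch, and again not how Novamente actually stores
things: the "deep internal" version of the same content would look less
like a few tidy assertions and more like a huge number of weak,
context-tagged links, along these lines.)

from dataclasses import dataclass

@dataclass
class ContextualLink:
    context: str          # episode, scene, or perceptual context the link holds in
    relation: str
    args: tuple
    strength: float       # often low: individually weak relationships
    confidence: float     # often low: based on sparse evidence

internal_store = [
    ContextualLink("scene-grandmothers-cottage", "resembles",
                   ("wolf-in-nightcap-percept", "grandmother-schema"), 0.6, 0.2),
    ContextualLink("scene-grandmothers-cottage", "precedes",
                   ("utterance-what-big-eyes", "event-wolf-pounces"), 0.7, 0.3),
    # ...plus a very large number more of these, most of them individually uncertain
]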

It is for this reason, among others, that I find Cyc-type AI systems a bit
misguided.  (Another main reason is their lack of effective learning
algorithms; and then there's the fact that the absence of perceptual-motor
grounding makes it difficult for a useful self-model to emerge; etc.)

-- Ben G

