YKY,

Cyc has been around for a long time, with substantial financial,
computational and human resources invested in it.

Why do you think it will succeed in the next 5 years when it hasn't
for the last 20+ years?

What novel ideas do you intend to introduce into Cyc that will make it
suddenly begin to think and understand?

You say

The Cyc route actually bypasses experiential learning because it allows us
to directly enter commonsense knowledge into its KB.  That is perhaps the
most significant difference between these 2 approaches.

In fact, you can directly enter knowledge into Novamente in logic
form, just like Cyc.  We could load the Cyc KB into Novamente next
week if we wanted to.

The problem is that neither Cyc, nor NM, nor any other system is going
to be able to do any interesting learning and thinking based solely on
this kind of formal, abstracted "quasi common sense" knowledge.
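
To make the contrast concrete, here is a rough toy sketch in Python.  It is
purely illustrative: the predicate names are invented, and it is not Cyc's
or Novamente's actual representation or API.  It contrasts directly entered,
crisp assertions with the sort of uncertain, contextual relations an
experientially grounded system would accumulate.

from dataclasses import dataclass

@dataclass
class Assertion:
    """A single relation in a toy knowledge base."""
    predicate: str
    args: tuple
    strength: float = 1.0    # how strongly the relation is taken to hold
    confidence: float = 1.0  # how much evidence backs it
    context: str = ""        # context in which it was learned, if any

# Hand-authored, Cyc-style "quasi common sense": crisp, context-free.
kb = [
    Assertion("isa", ("Wolf", "Carnivore")),
    Assertion("capableOf", ("Wolf", "DeceivingPerson")),
]

# What experiential grounding tends to produce instead: many uncertain,
# context-dependent relations tied to particular episodes of experience.
kb += [
    Assertion("near", ("wolf_37", "grandmothers_house"), 0.9, 0.6, "episode_t12"),
    Assertion("looksLike", ("wolf_37", "grandmother"), 0.4, 0.3, "dim_light"),
]

The point is not the syntax but the profile of the knowledge: the first kind
can be typed in directly; the second kind has to come from interaction with
an environment.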

I think this is rather amply demonstrated by the long and profoundly
uninspiring history of Cyc and related systems.

I conjecture that for an AI system to make good use of a Cyc type KB,
it must have a reasonable level of experiential grounding for many of
the concepts in the KB.

-- Ben

On 4/12/07, YKY (Yan King Yin) <[EMAIL PROTECTED]> wrote:

On 4/6/07, Benjamin Goertzel <[EMAIL PROTECTED]> wrote:
> > Ben:  Are you interested in translating LRRH into Novamente's KR, as a
> > demo?
>
> Not really...
>
> Here's the thing: Novamente's KR is very flexible...
>
> So, one could translate LRRH into Novamente-ese in a way that would sorta
> resemble "Cyc plus probabilities, with some higher-order functions and
> pattern-intensities too"
>
> But, that wouldn't be likely to closely resemble the way LRRH would wind
> up being represented in the mind of a Novamente instance that really
> understood the story.
>
> So the exercise of explicitly writing LRRH in Novamente's KR would likely
> wind up being not only pointless, but actively misleading ;-)
>
> While I do think that a probabilistic logic based KR (as NM uses) is a
> good choice, I don't think that the compact logical representation a human
> would use to explicitly represent a story like LRRH, is really the right
> kind of representation for deep internal use by an AGI system.  An AGI's
> internal representation of a story like this may be logical in form, but is
> going to consist of a very large number of uncertain, contextual
> relationships, along with some of the crisper and more encapsulated ones
> like those a human would formulate if carrying out the exercise of encoding
> LRRH in logic.
>
> It is for this reason, among others, that I find Cyc-type AI systems a bit
> misguided
> (another main reason is their lack of effective learning algorithms; and
> then there's the fact that the absence of perceptual-motor grounding makes
> it difficult for a useful self-model to emerge; etc. etc.)


Hi Ben,

I understand the current situation with Novamente.  It seems that one
fundamental difference between Cyc and Novamente is that Cyc is focused on
the linguistic / symbolic level whereas Novamente is focused on sensory /
experiential learning.

My current intuition is that Cyc's route may achieve "a certain level of
intelligence" *sooner*.  (Although the work done on sensory-based AGI
would probably still be useful.)  This may sound kind of vague, but my
intuition is that if we invest in a Cyc-like AGI for 5 years, it may be able
to converse with humans in a natural language and answer some commonsense
queries (which the current Cyc is actually already somewhat capable of).  But
if you invest 5 years in a sensory-based AGI, the resulting AGI baby may
still be at the level of a 3-5 year old human.  It seems that much of your
work would be spent on sensory processing and experiential learning, the
latter of which is particularly inefficient.

The Cyc route actually bypasses experiential learning because it allows us
to directly enter commonsense knowledge into its KB.  That is perhaps the
most significant difference between these 2 approaches.

YKY
