Re: [agi] knowledge representation, Cyc

2007-04-17 Thread James Ratcliff
  Hi Ben,
  
 I understand the current situation with Novamente.  It seems that one 
fundamental difference between Cyc and Novamente is that Cyc is focused on the 
linguistic / symbolic level whereas Novamente is focused on sensory / 
experiential learning. 
  
 My current intuition is that Cyc's route may achieve "a certain level of 
intelligence" *sooner*.  (Although the work done with sensory-based AGI would 
probably still be useful.)  This may sound kind of vague, but my intuition is 
that if we invest in a Cyc-like AGI for 5 years, it may be able to converse 
with humans in a natural language and answer some commonsense queries (which 
the current Cyc is actually somewhat capable of).  But if you invest 5 years in 
a sensory-based AGI, the resulting AGI baby may still be at the level of a 
3- to 5-year-old human.  It seems that much of your work may be wasted on 
dealing with sensory processing and experiential learning; the latter is 
particularly inefficient. 
  
 The Cyc route actually bypasses experiential learning because it allows us to 
directly enter commonsense knowledge into its KB.  That is perhaps the most 
significant difference between these 2 approaches. 
  
 YKY
My first thought and gut reaction was "Yeah": record all knowledge about all 
objects and things, and then you can build an AI on top of that.  The problem 
is that you need a perfectly compiled, ready-to-use DB of information.  
  How do you tell it's perfect?  Use it and test it, in all situations.  This is 
where Cyc fails most horribly, unfortunately.
There is no real way to test all information in all situations, so some of 
Cyc's information will be incorrect for some usage; when the AI tries it, it 
fails and must correct the information in some way.
  Now, I think we need a small core of knowledge and abilities with which we can 
get an AI up on its feet and able to interact as soon as possible.  Does it know 
everything?  Can it do everything?  No.  But if you have a house robot, it 
really doesn't need to know about Paris, or the cities of France, until it has 
reason to know them.
  If it has a rich ability to interact with its users, to ask questions, and to 
get data from a simple source like Wikipedia or from direct experience, then it 
has the human ability to learn.
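James's "small core plus learn-on-demand" idea can be sketched roughly as follows. This is a hypothetical illustration: the `LazyKB` class, the `ask_external` fallback, and the example facts are all invented, with the fetch function standing in for a Wikipedia lookup or a clarifying question to the user.

```python
# Minimal sketch of learn-on-demand: a small hand-built core KB plus a
# fallback source that is consulted only when a query misses.

class LazyKB:
    def __init__(self, core, fetch_fn):
        self.facts = dict(core)      # small hand-built core
        self.fetch_fn = fetch_fn     # consulted only on a miss

    def query(self, topic):
        if topic not in self.facts:
            # Acquire knowledge only when there is a reason to know it.
            self.facts[topic] = self.fetch_fn(topic)
        return self.facts[topic]

def ask_external(topic):
    # Placeholder for a Wikipedia lookup or a question to the user.
    return f"(fetched description of {topic})"

kb = LazyKB({"kitchen": "room where food is prepared"}, ask_external)
print(kb.query("kitchen"))   # answered from the core
print(kb.query("Paris"))     # fetched only now, on first need
```

The house robot never loads facts about Paris up front; they enter the KB the first time something gives it reason to ask.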

James Ratcliff



___
James Ratcliff - http://falazar.com
Looking for something...
   

-
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?member_id=231415&user_secret=fabd7936

Re: [agi] knowledge representation, Cyc

2007-04-12 Thread Benjamin Goertzel

YKY,

Cyc has been around a long time with a large amount of financial,
computational and human resources invested into it.

Why do you think it will succeed in the next 5 years when it hasn't
for the last 20+?

What novel ideas do you intend to introduce into Cyc that will make it
suddenly begin to think and understand?

You say


The Cyc route actually bypasses experiential learning because it allows us
to directly enter commonsense knowledge into its KB.  That is perhaps the
most significant difference between these 2 approaches.


In fact, you can directly enter knowledge into Novamente in logic
form, just like Cyc.  We could load the Cyc KB into Novamente next
week if we wanted to.

The problem is that neither Cyc, nor NM, nor any other system is going
to be able to do any interesting learning and thinking based solely on
this kind of formal, abstracted "quasi common sense" knowledge.

I think this is rather amply demonstrated by the long and profoundly
uninspiring history of Cyc and related systems.

I conjecture that for an AI system to make good use of a Cyc type KB,
it must have a reasonable level of experiential grounding for many of
the concepts in the KB.
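One way to picture the grounding Ben conjectures is to tie each KB symbol to a prototype built from sensory episodes, so that abstract assertions about the symbol can be checked against new percepts. This is a toy sketch; the feature dimensions and numbers are invented.

```python
# Each concept keeps raw experience vectors; its prototype is their mean.
# A novel percept is matched to a concept by cosine similarity.
import math

class GroundedConcept:
    def __init__(self, name):
        self.name = name
        self.episodes = []          # raw experience vectors

    def observe(self, features):
        self.episodes.append(features)

    def prototype(self):
        n = len(self.episodes)
        return [sum(col) / n for col in zip(*self.episodes)]

def similarity(a, b):               # cosine similarity
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

cup = GroundedConcept("Cup")
cup.observe([0.9, 0.1, 0.8])        # e.g. (graspable, large, concave)
cup.observe([0.8, 0.2, 0.9])
# close to 1.0 for a percept near the learned prototype
print(round(similarity(cup.prototype(), [0.85, 0.15, 0.85]), 3))
```

A Cyc-style assertion about `Cup` would then have something experiential to be checked against, rather than floating free as pure symbol structure.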

-- Ben

On 4/12/07, YKY (Yan King Yin) <[EMAIL PROTECTED]> wrote:


On 4/6/07, Benjamin Goertzel <[EMAIL PROTECTED]> wrote:
> > Ben:  Are you interested in translating LRRH into Novamente's KR, as a
demo?
>
> Not really...
>
> Here's the thing: Novamente's KR is very flexible...
>
> So, one could translate LRRH into Novamente-ese in a way that would sorta
resemble "Cyc plus probabilities, with some higher-order functions and
pattern-intensities too"
>
> But, that wouldn't be likely to closely resemble the way LRRH would wind
up being represented in the mind of a Novamente instance that really
understood the story.
>
> So the exercise of explicitly writing LRRH in Novamente's KR would likely
wind up being not only pointless, but actively misleading ;-)
>
> While I do think that a probabilistic logic based KR (as NM uses) is a
good choice, I don't think that the compact logical representation a human
would use to explicitly represent a story like LRRH, is really the right
kind of representation for deep internal use by an AGI system.  An AGI's
internal representation of a story like this may be logical in form, but is
going to consist of a very large number of uncertain, contextual
relationships, along with some of the crisper and more encapsulated ones
like those a human would formulate if carrying out the exercise of encoding
LRRH in logic.
>
> It is for this reason, among others, that I find Cyc-type AI systems a bit
misguided
> (another main reason is their lack of effective learning algorithms; and
then there's the fact that the absence of perceptual-motor grounding makes
it difficult for a useful self-model to emerge; etc. etc.)


Hi Ben,

I understand the current situation with Novamente.  It seems that one
fundamental difference between Cyc and Novamente is that Cyc is focused on
the linguistic / symbolic level whereas Novamente is focused on sensory /
experiential learning.

My current intuition is that Cyc's route may achieve "a certain level of
intelligence" *sooner*.  (Although the work done with sensory-based AGI
would probably still be useful.)  This may sound kind of vague, but my
intuition is that if we invest in a Cyc-like AGI for 5 years, it may be able
to converse with humans in a natural language and answer some commonsense
queries (which the current Cyc is actually somewhat capable of).  But if
you invest 5 years in a sensory-based AGI, the resulting AGI baby may
still be at the level of a 3- to 5-year-old human.  It seems that much of your
work may be wasted on dealing with sensory processing and experiential
learning; the latter is particularly inefficient.

The Cyc route actually bypasses experiential learning because it allows us
to directly enter commonsense knowledge into its KB.  That is perhaps the
most significant difference between these 2 approaches.

YKY 




Re: [agi] knowledge representation, Cyc

2007-04-12 Thread YKY (Yan King Yin)

On 4/6/07, Benjamin Goertzel <[EMAIL PROTECTED]> wrote:

> Ben:  Are you interested in translating LRRH into Novamente's KR, as a

demo?


Not really...

Here's the thing: Novamente's KR is very flexible...

So, one could translate LRRH into Novamente-ese in a way that would sorta

resemble "Cyc plus probabilities, with some higher-order functions and
pattern-intensities too"


But, that wouldn't be likely to closely resemble the way LRRH would wind

up being represented in the mind of a Novamente instance that really
understood the story.


So the exercise of explicitly writing LRRH in Novamente's KR would likely

wind up being not only pointless, but actively misleading ;-)


While I do think that a probabilistic logic based KR (as NM uses) is a

good choice, I don't think that the compact logical representation a human
would use to explicitly represent a story like LRRH, is really the right
kind of representation for deep internal use by an AGI system.  An AGI's
internal representation of a story like this may be logical in form, but is
going to consist of a very large number of uncertain, contextual
relationships, along with some of the crisper and more encapsulated ones
like those a human would formulate if carrying out the exercise of encoding
LRRH in logic.


It is for this reason, among others, that I find Cyc-type AI systems a bit

misguided

(another main reason is their lack of effective learning algorithms; and

then there's the fact that the absence of perceptual-motor grounding makes
it difficult for a useful self-model to emerge; etc. etc.)

Hi Ben,

I understand the current situation with Novamente.  It seems that one
fundamental difference between Cyc and Novamente is that Cyc is focused on
the linguistic / symbolic level whereas Novamente is focused on sensory /
experiential learning.

My current intuition is that Cyc's route may achieve "a certain level of
intelligence" *sooner*.  (Although the work done with sensory-based AGI
would probably still be useful.)  This may sound kind of vague, but my
intuition is that if we invest in a Cyc-like AGI for 5 years, it may be able
to converse with humans in a natural language and answer some commonsense
queries (which the current Cyc is actually somewhat capable of).  But if
you invest 5 years in a sensory-based AGI, the resulting AGI baby may
still be at the level of a 3- to 5-year-old human.  It seems that much of your
work may be wasted on dealing with sensory processing and experiential
learning; the latter is particularly inefficient.

The Cyc route actually bypasses experiential learning because it allows us
to directly enter commonsense knowledge into its KB.  That is perhaps the
most significant difference between these 2 approaches.

YKY


Re: [agi] knowledge representation, Cyc

2007-04-05 Thread Benjamin Goertzel



Ben:  Are you interested in translating LRRH into Novamente's KR, as a
demo?




Not really...

Here's the thing: Novamente's KR is very flexible...

So, one could translate LRRH into Novamente-ese in a way that would sorta
resemble "Cyc plus probabilities, with some higher-order functions and
pattern-intensities too"

But, that wouldn't be likely to closely resemble the way LRRH would wind up
being represented in the mind of a Novamente instance that really understood
the story.

So the exercise of explicitly writing LRRH in Novamente's KR would likely
wind up being not only pointless, but actively misleading ;-)

While I do think that a probabilistic logic based KR (as NM uses) is a good
choice, I don't think that the compact logical representation a human would
use to explicitly represent a story like LRRH, is really the right kind of
representation for deep internal use by an AGI system.  An AGI's internal
representation of a story like this may be logical in form, but is going to
consist of a very large number of uncertain, contextual relationships, along
with some of the crisper and more encapsulated ones like those a human would
formulate if carrying out the exercise of encoding LRRH in logic.

It is for this reason, among others, that I find Cyc-type AI systems a bit
misguided
(another main reason is their lack of effective learning algorithms; and
then there's the fact that the absence of perceptual-motor grounding makes
it difficult for a useful self-model to emerge; etc. etc.)

-- Ben G





Re: [agi] knowledge representation, Cyc

2007-04-05 Thread Philip Goetz

On 3/29/07, Matt Mahoney <[EMAIL PROTECTED]> wrote:


Of course, that is a good way to learn Cyc.  My concern is that Cyc has been
trying for years without success to build a natural language interface.  There
is a huge mismatch between structured knowledge representations like Cyc and
the way that children actually learn language.


Huger than huge.

Cyc, like a lot of AI, is designed following the idea that you can
design a knowledge representation for your data, and then you can
design an architecture that says how to process that representation.

In the human brain, the representation IS the architecture.  The
representation of a concept like "hood" is a pattern of activation in
neurons in different regions of the brain.  The set of neurons that
are activated to represent "hood" is a function of how these regions
are connected together, which ALSO determines the order of events when
neurons are activated and ideas are processed.  Architecture and
representation are identical.
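The contrast Goetz draws can be sketched crudely: instead of a symbol looked up in a table, a concept is a pattern of activation over shared units, and the same structure both stores the pattern and drives processing (here, retrieval by pattern completion). The unit indices and concepts below are arbitrary illustrations, not a model of any real brain region.

```python
# Each concept = a set of active units; overlapping units give related
# meanings. Retrieval is pattern completion over the same structure that
# represents the concepts, rather than a separate lookup architecture.

CONCEPTS = {
    "hood":  {2, 5, 11, 17},
    "cloak": {2, 5, 12, 19},   # shares units with "hood"
    "wolf":  {3, 8, 14, 20},
}

def complete(partial_activation):
    # The concept whose unit pattern best overlaps the partial cue wins:
    # representation and processing use one and the same connectivity.
    return max(CONCEPTS, key=lambda c: len(CONCEPTS[c] & partial_activation))

print(complete({2, 5, 11}))   # a partial cue recovers "hood"
```

In a Cyc-style design, by contrast, `CONCEPTS` and `complete` would be engineered separately, which is exactly the split Goetz is objecting to.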

- Phil Goetz



Re: [agi] knowledge representation, Cyc

2007-03-29 Thread Matt Mahoney

--- "YKY (Yan King Yin)" <[EMAIL PROTECTED]> wrote:

> On 3/30/07, Matt Mahoney <[EMAIL PROTECTED]> wrote:
> > Wouldn't it save time in the long run to build a system that could
> translate
> > English into your KR?
> 
> Yes, that's the goal.  I'm just doing a human translation of the first
> paragraph or so, to get the feel of CycL.

Of course, that is a good way to learn Cyc.  My concern is that Cyc has been
trying for years without success to build a natural language interface.  There
is a huge mismatch between structured knowledge representations like Cyc and
the way that children actually learn language.

> It can also be compared with Novamente's version.  I think world-wide there
> are only about 3-5 KR schemes capable of representing such stories
> adequately.

Would one of these representations be the original English text?


-- Matt Mahoney, [EMAIL PROTECTED]



Re: [agi] knowledge representation, Cyc

2007-03-29 Thread Mark Waser
>> It can also be compared with Novamente's version.  

In one respect, Novamente's version is analogous to assembly language.  It operates at 
a very detailed level and can represent anything, but you're going to spend a lot of 
time, and tolerate a lot of verbosity, doing it.

In another respect, Novamente's version can't represent anything at all without 
either a) a standardized dictionary that is shared between KRs or b) a ton of 
rules like CycL has that are actually definitions in disguise (i.e. a 
dictionary).

A fully loaded Novamente with a dictionary and Cyc's rules would be awesome and 
you could probably do close to an auto-translation -- but the question is how 
you *consistently* load Novamente to that point.

My personal opinion and approach is that Novamente gives you too much 
flexibility and not enough structure (although, of course, the flexibility 
allows you to build the structure if you are so inclined).  Your mileage may 
vary since the proper point on the flexibility vs. structure trade-off spectrum 
is a serious guess on anyone's part and I'd value Ben's guess over anyone 
else's (except my own :-).

Anyways, my point is that you're not going to be (effectively) translating 
Little Red Riding Hood into Novamente anytime in the near future.

>> I think world-wide there are only about 3-5 KR schemes capable of 
>> representing such stories adequately. 

Could you name which ones you believe ARE capable?  (And I would love to see 
your translation of LRRH into CycL -- or any other KR.)
  - Original Message - 
  From: YKY (Yan King Yin) 
  To: agi@v2.listbox.com 
  Sent: Thursday, March 29, 2007 4:35 PM
  Subject: Re: [agi] knowledge representation, Cyc





  On 3/30/07, Matt Mahoney <[EMAIL PROTECTED]> wrote: 
  > Wouldn't it save time in the long run to build a system that could 
translate 
  > English into your KR?


  Yes, that's the goal.  I'm just doing a human translation of the first 
paragraph or so, to get the feel of CycL.

  It can also be compared with Novamente's version.  I think world-wide there 
are only about 3-5 KR schemes capable of representing such stories adequately. 

  YKY




Re: [agi] knowledge representation, Cyc

2007-03-29 Thread YKY (Yan King Yin)

On 3/30/07, Matt Mahoney <[EMAIL PROTECTED]> wrote:

Wouldn't it save time in the long run to build a system that could

translate

English into your KR?


Yes, that's the goal.  I'm just doing a human translation of the first
paragraph or so, to get the feel of CycL.

It can also be compared with Novamente's version.  I think world-wide there
are only about 3-5 KR schemes capable of representing such stories
adequately.

YKY



Re: [agi] knowledge representation, Cyc

2007-03-29 Thread Matt Mahoney

--- "YKY (Yan King Yin)" <[EMAIL PROTECTED]> wrote:

> I just talked to some Cyc folks, and they assured me that CycL is adequate
> to represent entire stories like Little Red Riding Hood.
> 
> The AGI framework has to operate on a knowledge representation language, and
> building that language is not a programming task, but rather an ontology
> engineering task, which I'm not very familiar with.  I guess we should not
> underestimate the amount of work required for the KR scheme.  If we use CycL
> we may save a lot of time.
> 
> I may try to translate LRRH into CycL to see if it is too cumbersome or
> what.

Wouldn't it save time in the long run to build a system that could translate
English into your KR?


-- Matt Mahoney, [EMAIL PROTECTED]



[agi] knowledge representation, Cyc

2007-03-29 Thread YKY (Yan King Yin)

I just talked to some Cyc folks, and they assured me that CycL is adequate
to represent entire stories like Little Red Riding Hood.

The AGI framework has to operate on a knowledge representation language, and
building that language is not a programming task, but rather an ontology
engineering task, which I'm not very familiar with.  I guess we should not
underestimate the amount of work required for the KR scheme.  If we use CycL
we may save a lot of time.

I may try to translate LRRH into CycL to see if it is too cumbersome or
what.
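Before attempting real CycL, the story's opening facts can be mocked up in a plain triple notation. This is a hedged sketch: the predicates, constants, and context name below are invented for illustration, and actual CycL uses #$-prefixed terms, microtheories, and far richer quantification than flat triples.

```python
# Toy triple encoding of LRRH's opening facts, loosely echoing the flavor
# of a logic KB. All names are hypothetical, not real CycL constants.

STORY_MT = "LittleRedRidingHoodMt"   # hypothetical story context

assertions = [
    ("isa",       "LRRH",    "HumanChild"),
    ("isa",       "LRRH",    "FemalePerson"),
    ("wearsItem", "LRRH",    "RedHood"),
    ("isa",       "RedHood", "HoodedGarment"),
    ("relatives", "LRRH",    "Grandmother"),
]

def query(pred, subj):
    # Toy retrieval: all objects asserted for (pred, subj).
    return [o for p, s, o in assertions if p == pred and s == subj]

print(query("isa", "LRRH"))   # ['HumanChild', 'FemalePerson']
```

Even this trivial fragment hints at how quickly the encoding work grows once the narrative (events, times, intentions, deception) has to be represented rather than just static facts.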

Ben:  Are you interested in translating LRRH into Novamente's KR, as a demo?

YKY
