--- Ben Goertzel <[EMAIL PROTECTED]> wrote:

> > As I am sure you are fully aware, you can't parse English without a
> > knowledge of the meanings involved. ("The council opposed the
> > demonstrators because they (feared/advocated) violence.") So how are you
> > going to learn meanings before you can parse, or how are you going to
> > parse before you learn meanings? They have to be interleaved in a
> > non-trivial way.
> 
> True indeed!
> 
> Feeding all the ambiguous interpretations of a load of sentences into a
> probabilistic logic network, and letting them get resolved by reference to
> each other, is a sort of "search for the most likely solution of a huge
> system of simultaneous equations" ... i.e., one needs to let each of a huge
> set of ambiguities be resolved by the others...
> 
> This is not an easy problem, but it's not on the face of it unsolvable...
> 
> But I think the solution will be easier with info from direct experience to
> nudge the process in the right direction...

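To make the "simultaneous equations" picture concrete, here is a toy
relaxation sketch in Python (purely illustrative; it is not PLN or any real
probabilistic logic network, and all the compatibility numbers and priors are
invented).  Two ambiguities constrain each other: which verb was intended
("feared" vs. "advocated") and who "they" refers to.  Each belief is
repeatedly reweighted by how well its readings fit the other belief, so each
ambiguity is resolved by the other one:

# Toy mutual-disambiguation sketch (illustrative only; not PLN or any real
# probabilistic logic engine, and all numbers are invented).

# Pairwise compatibility of (verb reading, pronoun referent).
compat = {
    ("feared", "they=council"): 0.9,           # the council feared violence
    ("feared", "they=demonstrators"): 0.1,
    ("advocated", "they=demonstrators"): 0.9,  # the demonstrators advocated it
    ("advocated", "they=council"): 0.1,
}

verb = {"feared": 0.6, "advocated": 0.4}       # slight prior toward "feared"
pronoun = {"they=council": 0.5, "they=demonstrators": 0.5}

def reweight(target, other, key):
    """Scale each reading of target by its expected compatibility with other."""
    new = {}
    for reading, p in target.items():
        support = sum(q * compat[key(reading, o)] for o, q in other.items())
        new[reading] = p * support
    z = sum(new.values())
    return {reading: v / z for reading, v in new.items()}

for _ in range(20):   # iterate; each ambiguity is resolved by the other one
    pronoun = reweight(pronoun, verb, lambda s, r: (r, s))
    verb = reweight(verb, pronoun, lambda r, s: (r, s))

print(verb)      # "feared" ends up dominating
print(pronoun)   # and, consistently, "they=council"
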
Children solve the problem by learning semantics before grammar.  Statistical
language models do the same thing: models like LSA and the vector space models
used in search do not depend on word order.
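For example, here is a minimal bag-of-words sketch in plain Python (not an
actual LSA implementation, just an illustration of the order-independence):
two sentences containing the same words in a different order map to identical
count vectors, so their cosine similarity is 1.0.

# Minimal bag-of-words sketch (not a real LSA system).  Two sentences with
# the same words in a different order produce identical count vectors, so
# their cosine similarity is 1.0 -- word order is invisible to the model.
from collections import Counter
import math

def bow(sentence):
    """Count word occurrences, ignoring order entirely."""
    return Counter(sentence.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

s1 = "the council feared the demonstrators"
s2 = "the demonstrators feared the council"
print(cosine(bow(s1), bow(s2)))   # 1.0 -- order carries no information here

LSA just factors a matrix of such counts with an SVD, so it inherits the same
blindness to word order.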



-- Matt Mahoney, [EMAIL PROTECTED]
