- Original Message
From: Richard Loosemore <[EMAIL PROTECTED]>
To: agi@v2.listbox.com
Sent: Tuesday, October 24, 2006 12:37:16 PM
Subject: Re: [agi] Language modeling
> Matt Mahoney wrote:
>> Converting natural language to a formal representation requires language ...
On 10/23/06, Matt Mahoney <[EMAIL PROTECTED]> wrote:
> I am interested in identifying barriers to language modeling and how to
> overcome them.
> I have no doubt that probabilistic models such as NARS and Novamente can
> adequately represent human knowledge.

NARS isn't a probabilistic model, though.
Hi.

> The state of the art in language modeling is at the level of simple
> sentences, modeling syntax using n-grams (usually trigrams) or hidden
> Markov models ...

Just a remark: Google recently made their up-to-5-grams available through LDC:
http://googleresearch.blogspot.com/2006/08/all-our-n-gra
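For concreteness, here is a minimal sketch of the kind of interpolated
trigram model described above. The TrigramModel class and the 0.6/0.3/0.1
interpolation weights are illustrative assumptions, a toy rather than any
particular system's implementation:

from collections import defaultdict

class TrigramModel:
    """Toy interpolated trigram model; counts collected in one online pass."""

    def __init__(self):
        self.uni = defaultdict(int)
        self.bi = defaultdict(int)
        self.tri = defaultdict(int)
        self.total = 0

    def train(self, words):
        ws = ["<s>", "<s>"] + words + ["</s>"]
        for i, w in enumerate(ws):
            self.uni[w] += 1
            if i >= 1:
                self.bi[(ws[i-1], w)] += 1
            if i >= 2:
                self.tri[(ws[i-2], ws[i-1], w)] += 1
        self.total += len(ws)

    def prob(self, w, w1, w2):
        # P(w | w1 w2): interpolate trigram, bigram, and unigram estimates.
        # The interpolation weights are arbitrary, untuned assumptions.
        p3 = self.tri[(w1, w2, w)] / self.bi[(w1, w2)] if self.bi[(w1, w2)] else 0.0
        p2 = self.bi[(w2, w)] / self.uni[w2] if self.uni[w2] else 0.0
        p1 = self.uni[w] / self.total if self.total else 0.0
        return 0.6 * p3 + 0.3 * p2 + 0.1 * p1

m = TrigramModel()
m.train("all frogs are green".split())
print(m.prob("green", "frogs", "are"))

Scaling exactly this counting scheme to web-sized corpora is what makes the
Google 5-gram release interesting: the counts, not the model, are the
expensive part.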
> So my question is: what is needed to extend language models to the level of
> compound sentences? More training data? Different training data? A new
> theory of language acquisition? More hardware? How much?

What is needed is:

A better training approach, involving presentation of compound sentences ...
> I am interested in identifying barriers to language modeling and how to
> overcome them.
> I have no doubt that probabilistic models such as NARS and Novamente can
> adequately represent human knowledge. Also, I have no doubt they can learn
> e.g. relations such as "all frogs are green" from examples ...

In child development, understanding seems to considerably precede the
ability to articulate that understanding. Also, development seems to
generally move from highly abstract representations (stick men, smiley
suns) to more concrete, adult-like ones.

On 23/10/06, justin corwin <[EMAIL PROTECTED]> wrote: ...
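Since, as Pei notes above, NARS is not probabilistic, it may help to see
concretely how "all frogs are green" carries a two-component truth value
there. A rough sketch follows, using the frequency/confidence definitions
from the NARS literature (f = w+/w, c = w/(w+k)); the evidence counts and
k = 1 are invented for illustration:

# Sketch only: NARS-style truth values vs. a single probability.
# f = w+/w (frequency), c = w/(w+k) (confidence); counts are invented.

K = 1  # NARS evidential-horizon parameter; k = 1 is the usual default

def nars_truth(positive_evidence, total_evidence):
    frequency = positive_evidence / total_evidence
    confidence = total_evidence / (total_evidence + K)
    return frequency, confidence

# "All frogs are green", after seeing 9 green frogs out of 10:
f, c = nars_truth(9, 10)
print(f"frog -> green-thing <{f:.2f}, {c:.2f}>")  # <0.90, 0.91>

# A purely probabilistic link would keep only one number, e.g.
# P(green | frog) = 0.9, losing how much evidence stands behind it.

The point is that two statements with the same frequency can differ sharply
in confidence, which a single probability per link cannot express.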
I don't exactly have the same reaction, but I have some things to add
to the following exchange.
On 10/23/06, Richard Loosemore <[EMAIL PROTECTED]> wrote:
> Matt Mahoney wrote:
>> Children also learn language as a progression toward increasingly complex
>> patterns.
>> - phonemes beginning at 2-4 weeks ...
YKY,

Of course there is no a priori difference between a set of nodes and links
and a set of logical relationships... The question with your DB of facts
about "love" and so forth is whether it captures the subtler, uncertain
patterns regarding love that we learn via experience. My strong suspicion is
...
Ben Goertzel wrote:
>> The limited expressive scope of classic ANNs was actually essential
>> for getting relatively naïve and simplistic learning algorithms (e.g.
>> backprop, Hebbian learning) to produce useful solutions to an
>> interesting (if still fairly narrow) class of problems.
>
> Well, ...
On 10/23/06, Ben Goertzel <[EMAIL PROTECTED]> wrote:
> 2) the distinction between
> 2a) using ungrounded formal symbols to pretend to represent knowledge,
> e.g. an explicit labeled internal symbol for "cat", one for "give", etc.
> 2b) having an AI system recognize patterns in its perception and a ...
On 10/23/06, Matt Mahoney <[EMAIL PROTECTED]> wrote:
> [...]
> One aspect of NARS and many other structured or semi-structured knowledge
> representations that concerns me is the direct representation of concepts
> such as "is-a", equivalence, logic ("if-then", "and", "or", "not"),
> quantifiers ("all", ...
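For concreteness, the "direct representation" style being questioned looks
roughly like the sketch below, with is-a links and quantified rules as
explicit, hand-labeled symbols; every atom here is an invented example, not
any system's actual syntax:

# Sketch of explicit, hand-labeled symbolic representation.
facts = [
    ("frog", "is-a", "amphibian"),
    ("amphibian", "is-a", "animal"),
]

# An explicit quantified rule: (all x) frog(x) -> green(x)
rule = ("all", "x", ("if-then", ("frog", "x"), ("green", "x")))

def isa(kb, a, b):
    """Transitive closure over explicit is-a links."""
    if (a, "is-a", b) in kb:
        return True
    return any(isa(kb, mid, b) for (s, r, mid) in kb if s == a and r == "is-a")

print(isa(facts, "frog", "animal"))  # True, via amphibian

The concern raised above is precisely that symbols like "is-a" and
"if-then" are given to the system rather than learned from experience.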
Hi,

> For instance, this means that the "cat" concept may well not be
> expressed by a single "cat" term, but perhaps by a complex learned
> (probabilistic) logical predicate.

I don't think it's really useful to discuss representing word meanings
without a sufficiently powerful notion of context (which ...
On 23 Oct 2006 at 10:06, Ben Goertzel wrote:
> A very careful distinction needs to be drawn between:
>
> 1) the distinction between
> 1a) using probabilistic and formal-logical operators for representing
> knowledge
> 1b) using neural-net type operators (or other purely quantitative,
> non-logic ...
Matt Mahoney wrote:
> My concern is that structured knowledge is inconsistent with the development
> of language in children. As I mentioned earlier, natural language has a
> structure that allows direct training in neural networks using fast, online
> algorithms such as perceptron learning, rather than ...
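A minimal sketch of the kind of fast, online perceptron learning mentioned
here, applied to a toy task (judging whether a word pair is a plausible
bigram); the feature scheme and training pairs are invented for
illustration, not a claim about how children actually learn:

from collections import defaultdict

weights = defaultdict(float)

def features(w1, w2):
    # Sparse indicator features over the pair and the individual words.
    return [f"pair={w1}_{w2}", f"left={w1}", f"right={w2}"]

def predict(w1, w2):
    return 1 if sum(weights[f] for f in features(w1, w2)) > 0 else -1

def train(examples, epochs=5):
    # Classic online perceptron: adjust weights only on mistakes, one
    # example at a time, so no batch over the corpus is ever needed.
    for _ in range(epochs):
        for (w1, w2), y in examples:
            if predict(w1, w2) != y:
                for f in features(w1, w2):
                    weights[f] += y

examples = [(("frogs", "are"), 1), (("are", "green"), 1),
            (("green", "frogs"), 1), (("are", "frogs"), -1)]
train(examples)
print(predict("frogs", "are"), predict("are", "frogs"))  # 1 -1

Each update touches only the few features present in one example, which is
what makes this kind of training fast and online.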
Hi Matt,

Regarding logic-based knowledge representation and language/perceptual/action
learning -- I understand the nature of your confusion, because the point you
are confused on is exactly the biggest point of confusion for new members of
the Novamente AI team.

A very careful distinction needs to be drawn between ...
On 10/22/06, Matt Mahoney <[EMAIL PROTECTED]> wrote:
> Also to Novamente, if I understand correctly. Terms are linked by a
> probability and confidence. This seems to me to be an optimization of a
> neural network or connectionist model, which is restricted to one number
> per link, representing probability ...
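The contrast being drawn can be made concrete with a small sketch: a
term-logic link carries a (strength, confidence) pair, while a connectionist
link is reduced to a single weight. The field names below are invented for
illustration, not Novamente's or NARS's actual data structures:

from dataclasses import dataclass

@dataclass
class TermLink:
    source: str
    target: str
    strength: float    # how strongly the evidence supports the relation
    confidence: float  # how much evidence there is at all

@dataclass
class ConnectionistLink:
    source: str
    target: str
    weight: float      # a single number must stand in for both

rich = TermLink("cat", "animal", strength=0.95, confidence=0.8)
flat = ConnectionistLink("cat", "animal", weight=0.95 * 0.8)  # information lost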
- Original Message
From: Pei Wang <[EMAIL PROTECTED]>
To: agi@v2.listbox.com
Sent: Saturday, October 21, 2006 7:03:39 PM
Subject: Re: [agi] SOTA
>Well, in that sense NARS also has some resemblance to a neural
>network, as well as many other AI systems.
Also to Novamente ...