Matt,

I really hope NARS can be simplified, but until you give me the details, such as how to calculate the truth value in your "converse" rule, I cannot see how you can do the same things with a simpler design.
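[For readers following the truth-value question, here is a minimal numeric sketch. The truth functions below are my reconstruction from Wang's NAL papers, not something stated in this thread, and the premise values <1.0, 0.9> and k = 1 are illustrative assumptions.]

```python
K = 1.0  # assumed NAL "evidential horizon" parameter k

def deduction(f1, c1, f2, c2):
    # {M -> P <f1,c1>, S -> M <f2,c2>} |- S -> P
    return f1 * f2, f1 * f2 * c1 * c2

def induction(f1, c1, f2, c2):
    # {M -> P <f1,c1>, M -> S <f2,c2>} |- S -> P
    w = f2 * c1 * c2
    return f1, w / (w + K)

def conversion(f, c):
    # {P -> S <f,c>} |- S -> P  (Matt's "converse" rule, NAL-style)
    w = f * c
    return 1.0, w / (w + K)

# Premises: M -> P and M -> S, both with illustrative truth <1.0, 0.9>
f1, c1 = 1.0, 0.9
f2, c2 = 1.0, 0.9

# Route 1: direct induction.
f_direct, c_direct = induction(f1, c1, f2, c2)

# Route 2: the proposed "replacement" -- convert M -> S into S -> M,
# then deduce S -> P from (M -> P, S -> M).
fc, cc = conversion(f2, c2)
f_indirect, c_indirect = deduction(f1, c1, fc, cc)

print(round(c_direct, 4))    # ~0.4475
print(round(c_indirect, 4))  # ~0.4263, lower than the direct route
```

Under these assumed functions the conversion-plus-deduction route ends with lower confidence (~0.43) than direct induction (~0.45), which is exactly the objection raised below.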
NARS has this conversion rule, which, together with the deduction rule, can "replace" induction/abduction, just as you suggested. However, conclusions produced in this way usually have lower confidence than those directly generated by induction/abduction, so this trick is not that useful in NARS. The result is discussed in http://www.cogsci.indiana.edu/pub/wang.inheritance_nal.ps , page 27.

As for your original claim that "The brain does not implement formal logic", my brief answers are:

(1) So what? Who said AI must duplicate the brain? Just because we cannot imagine another possibility?

(2) In a broad sense, "formal logic" is nothing but "domain-independent and justifiable data manipulation schemes". I haven't seen any argument for why AI cannot be achieved by implementing that. After all, "formal logic" is not limited to "First-Order Predicate Calculus plus Model Theory".

Pei

On Sat, Sep 20, 2008 at 4:44 PM, Matt Mahoney <[EMAIL PROTECTED]> wrote:
> --- On Fri, 9/19/08, Jan Klauck <[EMAIL PROTECTED]> wrote:
>
>> Formal logic doesn't scale up very well in humans. That's why this
>> kind of reasoning is so unpopular. Our capacities are that
>> small and we connect to other human entities for a kind of
>> distributed problem solving. Logic is just a tool for us to
>> communicate and reason systematically about problems we would
>> mess up otherwise.
>
> Exactly. That is why I am critical of probabilistic or uncertain logic.
> Humans are not very good at logic and arithmetic problems requiring long
> sequences of steps, but duplicating these defects in machines does not help.
> It does not solve the problem of translating natural language into formal
> language and back. When we need to solve such a problem, we use pencil and
> paper, or a calculator, or we write a program. The problem for AI is to
> convert natural language to a formal language or a program and back. The
> formal reasoning we already know how to do.
> Even though a language model is probabilistic, probabilistic logic is not a
> good fit. For example, in NARS we have deduction (P->Q, Q->R) => (P->R),
> induction (P->Q, P->R) => (Q->R), and abduction (P->R, Q->R) => (P->Q).
> Induction and abduction are not strictly true, of course, but in a
> probabilistic logic we can assign them partial truth values.
>
> For language modeling, we can simplify the logic. If we accept the "converse"
> rule (P->Q) => (Q->P) as partially true (if rain predicts clouds, then clouds
> may predict rain), then we can derive induction and abduction from deduction
> and converse. For induction, (P->Q, P->R) => (Q->P, P->R) => (Q->R).
> Abduction is similar. Allowing converse, the statement (P->Q) is really a
> fuzzy equivalence or association (P ~ Q), e.g. (rain ~ clouds).
>
> A language model is a set of associations between concepts. Language learning
> consists of two operations carried out on a massively parallel scale: forming
> associations and forming new concepts by clustering in context space. An
> example of the latter is:
>
> the dog is
> the cat is
> the house is
> ...
> the (noun) is
>
> So if we read "the glorp is" we learn that "glorp" is a noun. Likewise, we
> learn something of its meaning from its more distant context, e.g. "the glorp
> is eating my flowers". We do this by the transitive property of association,
> e.g. (glorp ~ eating flowers ~ rabbit).
>
> This is not to say NARS or other systems are wrong, but rather that they have
> more capability than we need to solve reasoning in AI. Whether the extra
> capability helps or not is something that requires experimental verification.
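[The clustering-in-context-space idea quoted above can be sketched in a few lines. This is a toy illustration under my own assumptions -- a made-up five-sentence corpus, one-word-left/one-word-right contexts, and Jaccard overlap as the association strength -- not an implementation of any actual language model.]

```python
from collections import defaultdict

# Toy corpus echoing the "the (noun) is" example above; all sentences invented.
corpus = [
    "the dog is barking",
    "the cat is sleeping",
    "the house is old",
    "the glorp is eating my flowers",
    "the rabbit is eating my flowers",
]

# Map each word to the set of (left, right) contexts it appears in.
contexts = defaultdict(set)
for sentence in corpus:
    words = ["<s>"] + sentence.split() + ["</s>"]
    for i in range(1, len(words) - 1):
        contexts[words[i]].add((words[i - 1], words[i + 1]))

def similarity(a, b):
    """Jaccard overlap of context sets: a crude association strength."""
    ca, cb = contexts[a], contexts[b]
    return len(ca & cb) / len(ca | cb) if ca | cb else 0.0

# "glorp" occurs only in the "the _ is" context, which it shares with the
# known nouns, so it clusters with them despite never being seen before.
print(similarity("glorp", "dog"))      # 1.0 -- same noun cluster
print(similarity("glorp", "barking"))  # 0.0 -- no shared context
```

With richer contexts than a one-word window, the same mechanism would also link "glorp" to "rabbit" through the shared "eating my flowers" material, which is the transitive-association point made above.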
> -- Matt Mahoney, [EMAIL PROTECTED]
>
> -------------------------------------------
> agi
> Archives: https://www.listbox.com/member/archive/303/=now
> RSS Feed: https://www.listbox.com/member/archive/rss/303/
> Modify Your Subscription: https://www.listbox.com/member/?&
> Powered by Listbox: http://www.listbox.com