Re: The brain does not implement formal logic (was Re: [agi] Where the Future of AGI Lies)

2008-09-22 Thread Russell Wallace
On Mon, Sep 22, 2008 at 1:34 AM, Ben Goertzel [EMAIL PROTECTED] wrote: On the other hand, if intelligence is in large part a systems phenomenon, that has to do with the interconnection of reasonably-intelligent components in a reasonably-intelligent way (as I have argued in many prior

Re: The brain does not implement formal logic (was Re: [agi] Where the Future of AGI Lies)

2008-09-22 Thread Matt Mahoney
--- On Sun, 9/21/08, David Hart [EMAIL PROTECTED] wrote: On Mon, Sep 22, 2008 at 10:08 AM, Matt Mahoney [EMAIL PROTECTED] wrote: Training will be the overwhelming cost of AGI. Any language model improvement will help reduce this cost. How do you figure that training will cost more than

Re: The brain does not implement formal logic (was Re: [agi] Where the Future of AGI Lies)

2008-09-21 Thread Eric Burton
Hmm. My bot mostly repeats what it hears.
bot Monie: haha. r u a bot ?
bot cyberbrain: not to mention that in a theory complex enough with a large enough number of parameters. one can interpret anything. even things that are completely physically inconsistent with each other. i suggest actually

Re: The brain does not implement formal logic (was Re: [agi] Where the Future of AGI Lies)

2008-09-21 Thread Eric Burton
Ok, most of its replies here seem to be based on the first word of what it's replying to. But it's really capable of more lateral connections.
wijnand: yeah i use it to add shortcuts for some menu functions i use a lot
bot wijnand: TOMACCO!!! On 9/21/08, Eric Burton [EMAIL PROTECTED] wrote:
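
What Eric describes reduces to a lookup table keyed on the first word of the incoming line, with verbatim echo as the fallback. A minimal sketch of such a bot in Python; the class name, learning interface, and sample lines are invented here, since Eric's actual code never appears in the thread:

    import random
    from collections import defaultdict

    def _key(text):
        # Key on the first word, ignoring case and trailing punctuation.
        words = text.split()
        return words[0].strip(".,!?").lower() if words else ""

    class FirstWordBot:
        def __init__(self):
            # first word heard -> replies previously seen after it
            self.replies = defaultdict(list)

        def learn(self, heard, reply):
            self.replies[_key(heard)].append(reply)

        def respond(self, heard):
            options = self.replies.get(_key(heard))
            # Fall back to echoing, since the bot "mostly repeats what it hears".
            return random.choice(options) if options else heard

    bot = FirstWordBot()
    bot.learn("yeah i use it to add shortcuts", "TOMACCO!!!")
    print(bot.respond("yeah, what does it do?"))  # -> TOMACCO!!!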

Re: The brain does not implement formal logic (was Re: [agi] Where the Future of AGI Lies)

2008-09-21 Thread Matt Mahoney
--- On Sat, 9/20/08, Mike Tintner [EMAIL PROTECTED] wrote: Matt: A more appropriate metaphor is that text compression is the altimeter by which we measure progress. (1) Matt, Now that sentence is a good example of general intelligence - forming a new connection between domains -

Re: The brain does not implement formal logic (was Re: [agi] Where the Future of AGI Lies)

2008-09-21 Thread Matt Mahoney
--- On Sat, 9/20/08, Ben Goertzel [EMAIL PROTECTED] wrote: A more appropriate metaphor is that text compression is the altimeter by which we measure progress. An extremely major problem with this idea is that, according to this altimeter, gzip is vastly more intelligent than a chimpanzee or a

Re: The brain does not implement formal logic (was Re: [agi] Where the Future of AGI Lies)

2008-09-21 Thread Ben Goertzel
Now if you want to compare gzip, a chimpanzee, and a 2-year-old child using language prediction as your IQ test, then I would say that gzip falls in the middle. A chimpanzee has no language model, so it is lowest. A 2-year-old child can identify word boundaries in continuous speech, can

Re: The brain does not implement formal logic (was Re: [agi] Where the Future of AGI Lies)

2008-09-21 Thread Matt Mahoney
--- On Sat, 9/20/08, Pei Wang [EMAIL PROTECTED] wrote: Matt, I really hope NARS can be simplified, but until you give me the details, such as how to calculate the truth value in your converse rule, I cannot see how you can do the same things with a simpler design. You're right. Given

Re: The brain does not implement formal logic (was Re: [agi] Where the Future of AGI Lies)

2008-09-21 Thread Matt Mahoney
--- On Sun, 9/21/08, Ben Goertzel [EMAIL PROTECTED] wrote: Hmmm I am pretty strongly skeptical of intelligence tests that do not measure the actual functionality of an AI system, but rather measure the theoretical capability of the structures or processes or data inside the system... The

Re: The brain does not implement formal logic (was Re: [agi] Where the Future of AGI Lies)

2008-09-21 Thread Ben Goertzel
I'm not building AGI. (That is a $1 quadrillion problem). I'm studying algorithms for learning language. Text compression is a useful tool for measuring progress (although not for vision). OK, but the focus of this list is supposed to be AGI, right ... so I suppose I should be forgiven for

Re: The brain does not implement formal logic (was Re: [agi] Where the Future of AGI Lies)

2008-09-21 Thread Matt Mahoney
--- On Sun, 9/21/08, Ben Goertzel [EMAIL PROTECTED] wrote: Text compression is IMHO a terrible way of measuring incremental progress toward AGI. Of course it may be very valuable for other purposes... It is a way to measure progress in language modeling, which is an important component of AGI
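
For concreteness, the measurement Matt has in mind fits in a few lines: compress a corpus and report bits per byte, where a lower figure implies a better implicit model of the text. A minimal sketch using zlib (the algorithm behind gzip) as the weak baseline a learned model should beat; the corpus file name is an assumption:

    import zlib

    def bits_per_byte(text: str) -> float:
        # Fewer bits per byte = better implicit prediction of the text.
        raw = text.encode("utf-8")
        compressed = zlib.compress(raw, 9)
        return 8.0 * len(compressed) / len(raw)

    sample = open("corpus.txt", encoding="utf-8").read()  # hypothetical corpus
    print(f"{bits_per_byte(sample):.3f} bits/byte")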

Re: The brain does not implement formal logic (was Re: [agi] Where the Future of AGI Lies)

2008-09-21 Thread Ben Goertzel
On Sun, Sep 21, 2008 at 8:08 PM, Matt Mahoney [EMAIL PROTECTED] wrote: --- On Sun, 9/21/08, Ben Goertzel [EMAIL PROTECTED] wrote: Text compression is IMHO a terrible way of measuring incremental progress toward AGI. Of course it may be very valuable for other purposes... It is a way to

Re: The brain does not implement formal logic (was Re: [agi] Where the Future of AGI Lies)

2008-09-21 Thread Matt Mahoney
--- On Sun, 9/21/08, Ben Goertzel [EMAIL PROTECTED] wrote: On Sun, Sep 21, 2008 at 8:08 PM, Matt Mahoney [EMAIL PROTECTED] wrote: --- On Sun, 9/21/08, Ben Goertzel [EMAIL PROTECTED] wrote: Text compression is IMHO a terrible way of measuring incremental progress toward AGI. Of course it may

Re: The brain does not implement formal logic (was Re: [agi] Where the Future of AGI Lies)

2008-09-21 Thread David Hart
On Mon, Sep 22, 2008 at 10:08 AM, Matt Mahoney [EMAIL PROTECTED] wrote: Training will be the overwhelming cost of AGI. Any language model improvement will help reduce this cost. How do you figure that training will cost more than designing, building and operating AGIs? Unlike training a

Re: The brain does not implement formal logic (was Re: [agi] Where the Future of AGI Lies)

2008-09-20 Thread Jiri Jelinek
Matt, So, what formal language model can solve this problem? An FL that clearly separates basic semantic concepts like objects, attributes, time, space, actions, roles, relationships, etc. + core subjective concepts, e.g. want, need, feel, aware, believe, expect, unreal/fantasy. Humans have senses
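
A hypothetical sketch of the kind of formal-language statement Jiri describes, with semantic roles made explicit instead of buried in syntax; every field name below is invented for illustration and is not Jiri's design:

    from dataclasses import dataclass, field

    @dataclass
    class Statement:
        agent: str                    # who acts or experiences
        action: str                   # relation, e.g. "own", "want", "believe"
        patient: str                  # what is acted on
        attributes: dict = field(default_factory=dict)
        time: str = "unspecified"
        reality: str = "real"         # "real" vs. "fantasy", per Jiri's list

    fact = Statement(agent="Dan Brown", action="own", patient="coat",
                     attributes={"color": "black"})
    # With roles explicit, "what color is the coat?" becomes a lookup
    # rather than a free-text parsing problem.
    print(fact.attributes["color"])   # -> black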

Re: The brain does not implement formal logic (was Re: [agi] Where the Future of AGI Lies)

2008-09-20 Thread Russell Wallace
On Fri, Sep 19, 2008 at 11:46 PM, Matt Mahoney [EMAIL PROTECTED] wrote: So perhaps someone can explain why we need formal knowledge representations to reason in AI. Because the biggest open subproblem right now is dealing with procedural, as opposed to merely declarative or reflexive,

Re: The brain does not implement formal logic (was Re: [agi] Where the Future of AGI Lies)

2008-09-20 Thread Matt Mahoney
--- On Fri, 9/19/08, Jan Klauck [EMAIL PROTECTED] wrote: Formal logic doesn't scale up very well in humans. That's why this kind of reasoning is so unpopular. Our capacities are that small and we connect to other human entities for a kind of distributed problem solving. Logic is just a tool

Re: The brain does not implement formal logic (was Re: [agi] Where the Future of AGI Lies)

2008-09-20 Thread Ben Goertzel
On Sat, Sep 20, 2008 at 4:44 PM, Matt Mahoney [EMAIL PROTECTED] wrote: --- On Fri, 9/19/08, Jan Klauck [EMAIL PROTECTED] wrote: Formal logic doesn't scale up very well in humans. That's why this kind of reasoning is so unpopular. Our capacities are that small and we connect to other human

Re: The brain does not implement formal logic (was Re: [agi] Where the Future of AGI Lies)

2008-09-20 Thread Ben Goertzel
On Sat, Sep 20, 2008 at 6:24 PM, Matt Mahoney [EMAIL PROTECTED] wrote: --- On Sat, 9/20/08, Ben Goertzel [EMAIL PROTECTED] wrote: If formal reasoning were a solved problem in AI, then we would have theorem-provers that could prove deep, complex theorems unassisted. We don't. This indicates

Re: The brain does not implement formal logic (was Re: [agi] Where the Future of AGI Lies)

2008-09-20 Thread Matt Mahoney
--- On Sat, 9/20/08, Ben Goertzel [EMAIL PROTECTED] wrote: It seems a big stretch to me to call theorem-proving guidance a language modeling problem ... one may be able to make sense of this statement, but only by treating the concept of language VERY

Re: The brain does not implement formal logic (was Re: [agi] Where the Future of AGI Lies)

2008-09-20 Thread Pei Wang
Matt, I really hope NARS can be simplified, but until you give me the details, such as how to calculate the truth value in your converse rule, I cannot see how you can do the same things with a simpler design. NARS has this conversion rule, which, with the deduction rule, can replace
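
For readers following the exchange: the conversion truth-function Pei refers to is defined in Wang's NAL publications. From <S -> P> with truth value (f, c), conversion yields <P -> S> with frequency 1 and confidence f*c / (f*c + k), where k is the evidential-horizon parameter, conventionally 1. A minimal sketch assuming that published definition:

    def nars_conversion(f: float, c: float, k: float = 1.0) -> tuple:
        # Positive evidence carried over to the converted statement.
        w_plus = f * c
        # Converted statement has frequency 1 and a reduced confidence.
        return 1.0, w_plus / (w_plus + k)

    print(nars_conversion(0.9, 0.9))  # -> (1.0, 0.447...)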

Re: The brain does not implement formal logic (was Re: [agi] Where the Future of AGI Lies)

2008-09-20 Thread Ben Goertzel
To pursue an overused metaphor, to me that's sort of like trying to understand flight by carefully studying the most effective high-jumpers. OK, you might learn something, but you're not getting at the crux of the problem... A more appropriate metaphor is that text compression is the

Re: The brain does not implement formal logic (was Re: [agi] Where the Future of AGI Lies)

2008-09-20 Thread Mike Tintner
Pei: In a broad sense, formal logic is nothing but domain-independent and justifiable data manipulation schemes. I haven't seen any argument for why AI cannot be achieved by implementing that. Have you provided a single argument as to how logic *can* achieve AI - or to be more precise,

Re: The brain does not implement formal logic (was Re: [agi] Where the Future of AGI Lies)

2008-09-20 Thread Mike Tintner
Ben: Mike: (And can you provide an example of a single surprising metaphor or analogy that has ever been derived logically? Jiri said he could - but didn't.) It's a bad question -- one could derive surprising metaphors or analogies by random search, and that wouldn't prove anything

Re: The brain does not implement formal logic (was Re: [agi] Where the Future of AGI Lies)

2008-09-20 Thread Ben Goertzel
Mike, I understand that my task is to create an AGI system, and I'm working on it ... The fact that my in-development, partial AGI system has not yet demonstrated advanced intelligence, does not imply that it will not do so once completed. No, my AGI system has not yet discovered surprising

Re: The brain does not implement formal logic (was Re: [agi] Where the Future of AGI Lies)

2008-09-20 Thread Ben Goertzel
and not to forget... SATAN GUIDES US TELEPATHICLY THROUGH RECTAL THERMOMETERS. WHY DO YOU THINK ABOUT META-REASONING? On Sat, Sep 20, 2008 at 11:38 PM, Ben Goertzel [EMAIL PROTECTED] wrote: Mike, I understand that my task is to create an AGI system, and I'm working on it ... The fact

Re: The brain does not implement formal logic (was Re: [agi] Where the Future of AGI Lies)

2008-09-20 Thread Mike Tintner
Ben, Not one metaphor below works. You have in effect accepted the task of providing a philosophy and explanation of your AGI and your logic - you have produced a great deal of such stuff (quite correctly). But none of it includes the slightest explanation of how logic can produce AGI - or,

Re: The brain does not implement formal logic (was Re: [agi] Where the Future of AGI Lies)

2008-09-20 Thread Ben Goertzel
Mike, If you want an explanation of why I think my AGI system will work, please see http://opencog.org/wiki/OpenCogPrime:WikiBook. The argument is complex and technical and it would not be a good use of my time to recapitulate it via email!! Personally I do think the metaphor COWS FLY LIKE

Re: The brain does not implement formal logic (was Re: [agi] Where the Future of AGI Lies)

2008-09-20 Thread Mike Tintner
Ben, Just to be clear, when I said "no argument re how logic will produce AGI" I meant, of course, as per the previous posts, how logic will [surprisingly] cross domains etc. That, for me, is the defining characteristic of AGI. All the rest is narrow AI.

The brain does not implement formal logic (was Re: [agi] Where the Future of AGI Lies)

2008-09-19 Thread Matt Mahoney
--- On Fri, 9/19/08, Jiri Jelinek [EMAIL PROTECTED] wrote: Try "What's the color of Dan Brown's black coat?" What's the excuse for a general problem solver to fail in this case? NLP? It then should use a formal language or so. Google uses relatively good search algorithms but decent general
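
Jiri's example is pointed because the question literally contains its answer in the noun phrase. A toy, deliberately brittle pattern match makes the point; no real question-answering system reduces to this:

    import re

    question = "What's the color of Dan Brown's black coat?"
    # The adjective modifying "coat" is the answer; extract it directly.
    m = re.search(r"the color of .*?'s (\w+) coat", question)
    print(m.group(1) if m else "no match")  # -> black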

Re: The brain does not implement formal logic (was Re: [agi] Where the Future of AGI Lies)

2008-09-19 Thread Ben Goertzel
Matt wrote, There seems to be a lot of effort to implement reasoning in knowledge representation systems, even though it has little to do with how we actually think. Please note that not all of us in the AGI field are trying to closely emulate human thought. Human-level thought does not

Re: The brain does not implement formal logic (was Re: [agi] Where the Future of AGI Lies)

2008-09-19 Thread Eric Burton
I think the whole idea of a semantic layer is to provide the kind of mechanism for abstract reasoning that evolution seems to have built into the human brain. You could argue that those faculties are acquired during one's life, using only a weighted neural net (brain), but it seems reasonable to

Re: The brain does not implement formal logic (was Re: [agi] Where the Future of AGI Lies)

2008-09-19 Thread Trent Waddington
On Sat, Sep 20, 2008 at 8:46 AM, Matt Mahoney [EMAIL PROTECTED] wrote: But if you can learn these types of patterns then with no additional effort you can learn patterns that directly solve the problem... This kind of reminds me of the "people think in their natural language" theory that Steven

Re: The brain does not implement formal logic (was Re: [agi] Where the Future of AGI Lies)

2008-09-19 Thread Jan Klauck
Matt, People who haven't studied logic or its notation can certainly learn to do this type of reasoning. Formal logic doesn't scale up very well in humans. That's why this kind of reasoning is so unpopular. Our capacities are that small and we connect to other human entities for a kind of