Re: [agi] reasoning & knowledge

2008-03-13 Thread Bob Mottram
On 13/03/2008, Linas Vepstas <[EMAIL PROTECTED]> wrote: > > object itself. How, say, do you get from a human face to the distorted portraits of Modigliani, Picasso, Francis Bacon, Scarfe, or any cartoonist? By logical or mathematical formulae? > Actually, yes. Computer vision p

Re: [agi] reasoning & knowledge

2008-03-13 Thread Linas Vepstas
On 14/02/2008, Mike Tintner <[EMAIL PROTECTED]> wrote: > Pei: > Though many people assume "reasoning" can only be applied to "symbolic" or "linguistic" materials, I'm not convinced yet, nor that there is really a separate "imaginative reasoning" --- at least I haven't seen a concr

Re: Common Sense Consciousness [WAS Re: [agi] reasoning & knowledge]

2008-02-29 Thread Matt Mahoney
--- Mike Tintner <[EMAIL PROTECTED]> wrote: > > >Vlad:> Don't you know about change blindness and the like? You don't > >actually > > see all these details, it's delusional. You only get the gist of the > > scene, according to current context that forms the focus of your > > attention. Amount o

Re: Common Sense Consciousness [WAS Re: [agi] reasoning & knowledge]

2008-02-29 Thread Richard Loosemore
Bob Mottram wrote: On 29/02/2008, Mike Tintner <[EMAIL PROTECTED]> wrote: consciousness is a continuously moving picture with the other senses continuous too There doesn't seem to be much evidence for this. People with damage to MT, or certain types of visual migraine, see the world as a slo

Re: Common Sense Consciousness [WAS Re: [agi] reasoning & knowledge]

2008-02-29 Thread Richard Loosemore
Mike Tintner wrote: Sorry, yes the "run" is ambiguous. I mean that what the human mind does is *watch* continuous movies - but it then runs/creates its own extensive movies based on its experience in dreams - and, with some effort, replay movies in conscious imagination. The point is: my i

Re: Common Sense Consciousness [WAS Re: [agi] reasoning & knowledge]

2008-02-29 Thread Mike Tintner
Vlad:> Don't you know about change blindness and the like? You don't actually see all these details, it's delusional. You only get the gist of the scene, according to current context that forms the focus of your attention. Amount of information you extract from watching a movie is not dramatica

Re: [agi] reasoning & knowledge

2008-02-29 Thread Mike Tintner
Robert: I think it would be more accurate to say that technological meme evolution was caused by the biological evolution, rather than being the extension of it, since they are in fact two quite different evolutionary systems, with different kinds of populations/survival conditions. I would sa

Re: Common Sense Consciousness [WAS Re: [agi] reasoning & knowledge]

2008-02-29 Thread Vladimir Nesov
Mike, Don't you know about change blindness and the like? You don't actually see all these details, it's delusional. You only get the gist of the scene, according to current context that forms the focus of your attention. Amount of information you extract from watching a movie is not dramatically

Re: [agi] reasoning & knowledge

2008-02-29 Thread Robert Wensman
> d) you keep repeating the illusion that evolution did NOT achieve the airplane and other machines - oh yes, it did - your central illusion here is that machines are independent species. They're not. They are EXTENSIONS of human beings, and don't work without human beings attached.

Re: Common Sense Consciousness [WAS Re: [agi] reasoning & knowledge]

2008-02-29 Thread Mike Tintner
Eh? Move your hand across the desk. You see that as a series of snapshots? Move a noisy object across. You don't see a continuous picture with a continuous soundtrack? Let me give you an example of how impressive I think the brain's powers here are. I've been thinking about metaphor and the su

Re: Common Sense Consciousness [WAS Re: [agi] reasoning & knowledge]

2008-02-29 Thread Bob Mottram
On 29/02/2008, Mike Tintner <[EMAIL PROTECTED]> wrote: > consciousness is a continuously moving picture with the other senses continuous too There doesn't seem to be much evidence for this. People with damage to MT, or certain types of visual migraine, see the world as a slow jerky series of s

Re: Common Sense Consciousness [WAS Re: [agi] reasoning & knowledge]

2008-02-28 Thread Mike Tintner
Sorry, yes the "run" is ambiguous. I mean that what the human mind does is *watch* continuous movies - but it then runs/creates its own extensive movies based on its experience in dreams - and, with some effort, replay movies in conscious imagination. The point is: my impression is that in

Re: Common Sense Consciousness [WAS Re: [agi] reasoning & knowledge]

2008-02-28 Thread Richard Loosemore
Mike Tintner wrote: Er, just to clarify. You guys have, or know of, AI systems which run continuous movies of the world, analysing and responding to those movies with all the relevant senses, as discussed below, and then to the world beyond those movies, in real time (or any time, for that matt

Re: Common Sense Consciousness [WAS Re: [agi] reasoning & knowledge]

2008-02-28 Thread Mike Tintner
Er, just to clarify. You guys have, or know of, AI systems which run continuous movies of the world, analysing and responding to those movies with all the relevant senses, as discussed below, and then to the world beyond those movies, in real time (or any time, for that matter)? Mike Tint

Re: [agi] reasoning & knowledge

2008-02-28 Thread Jim Bromer
Ben Goertzel <[EMAIL PROTECTED]> wrote: That is purely rhetorical gamesmanship Ben, I studied some rhetoric, and while I learned how to avoid some of the worst pitfalls of gamesmanship and how to avoid wasting *all* of my time, I found that the study, which was one of Aristotle's subjects by

Re: Common Sense Consciousness [WAS Re: [agi] reasoning & knowledge]

2008-02-28 Thread Jim Bromer
Mike Tintner wrote: > You're crossing a road - you track both the oncoming car and your body > with all your senses at once - see a continuous moving image of the > car, hear the noise of the engine and tires, possibly smell it if > there's a smell of gasoline, have a kinaesthetic sense of

Re: [agi] reasoning & knowledge

2008-02-27 Thread Bob Mottram
On 27/02/2008, Ben Goertzel <[EMAIL PROTECTED]> wrote: > I don't buy that my body plays a significant role in thinking about, > for instance, > mathematics. I bet that my brain in a vat could think about math just > as well or > better than my embodied brain. > > Of course my brain is what it

Re: [agi] reasoning & knowledge

2008-02-27 Thread Ben Goertzel
I do not doubt that body-thinking exists and is important; my doubt is that it is in any AGI-useful sense "the largest part" of thinking... On Wed, Feb 27, 2008 at 1:07 PM, Mike Tintner <[EMAIL PROTECTED]> wrote: > Ben: What evidence do you have that this [body thinking] is the "largest part" .

Re: Common Sense Consciousness [WAS Re: [agi] reasoning & knowledge]

2008-02-27 Thread Richard Loosemore
J Storrs Hall, PhD wrote: On Wednesday 27 February 2008 12:22:30 pm, Richard Loosemore wrote: Mike Tintner wrote: As Ben said, it's something like "multisensory integrative consciousness" - i.e. you track a subject/scene with all senses simultaneously and integratedly. Conventional approaches

Re: [agi] reasoning & knowledge

2008-02-27 Thread Mike Tintner
Ben:What evidence do you have that this [body thinking] is the "largest part" ... it does not feel at all that way to me, as a subjectively-experiencing human; and I know of no evidence in this regard Like I said, I'm at the start here - and this is going against thousands of years of literat

Re: Common Sense Consciousness [WAS Re: [agi] reasoning & knowledge]

2008-02-27 Thread J Storrs Hall, PhD
On Wednesday 27 February 2008 12:22:30 pm, Richard Loosemore wrote: > Mike Tintner wrote: > > As Ben said, it's something like "multisensory integrative > > consciousness" - i.e. you track a subject/scene with all senses > > simultaneously and integratedly. > > Conventional approaches to AI may

Re: [agi] reasoning & knowledge

2008-02-27 Thread Ben Goertzel
> Well, what I and embodied cognitive science are trying to formulate > properly, both philosophically and scientifically, is why: > > a) common sense consciousness is the brain-AND-body thinking on several > levels simultaneously about any given subject... I don't buy that my body plays a si

Re: [agi] reasoning & knowledge

2008-02-27 Thread Ben Goertzel
> d) you keep repeating the illusion that evolution did NOT achieve the airplane and other machines - oh yes, it did - your central illusion here is that machines are independent species. They're not. They are EXTENSIONS of human beings, and don't work without human beings attached. Mani

Common Sense Consciousness [WAS Re: [agi] reasoning & knowledge]

2008-02-27 Thread Richard Loosemore
Mike Tintner wrote: Richard: Mike Tintner wrote: No one in AGI is aiming for common sense consciousness, are they? Inasmuch as I understand what you mean by that, yes of course. Both common sense and consciousness. As Ben said, it's something like "multisensory integrative consciousness"

Re: [agi] reasoning & knowledge

2008-02-27 Thread Mike Tintner
Ben: MT:>> You guys seem to think this - true common sense consciousness - can all be cracked in a year or two. I think there's probably a lot of good reasons - and therefore major creative problems - why it took a billion years of evolution to achieve. Ben: I'm not trying to emulate th

Re: [agi] reasoning & knowledge

2008-02-27 Thread Bob Mottram
What I tried to do with robocore is to have a number of subsystems dedicated to particular modalities such as vision, touch, hearing, smell and so on. Each of these modalities operates in a semi-independent and self-organised way, and their function is to create stable abstractions from the raw data
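Bob's description above maps onto a simple structure, illustrated here as a minimal Python sketch. This is my own hypothetical illustration, not robocore code: the names (ModalitySubsystem, Abstraction, integrate) are invented, and the feature extraction is a trivial stand-in for the self-organised processing he describes. It only shows the shape of the idea: each modality independently turns raw sensor data into a small, stable abstraction, and a coordinating step collects the results.

from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Abstraction:
    modality: str       # which sense produced this
    label: str          # coarse, stable summary of the raw signal
    confidence: float

class ModalitySubsystem:
    """A semi-independent processor for one sensory modality."""
    def __init__(self, modality: str):
        self.modality = modality

    def abstract(self, raw: List[float]) -> Abstraction:
        # Stand-in for self-organised feature extraction: summarise the
        # signal's mean energy into a coarse label.
        energy = sum(x * x for x in raw) / max(len(raw), 1)
        label = "strong signal" if energy > 0.5 else "weak signal"
        return Abstraction(self.modality, label, min(energy, 1.0))

def integrate(subsystems: Dict[str, ModalitySubsystem],
              raw_inputs: Dict[str, List[float]]) -> List[Abstraction]:
    """Collect each modality's abstraction into one multimodal snapshot."""
    return [subsystems[m].abstract(raw) for m, raw in raw_inputs.items()]

if __name__ == "__main__":
    systems = {m: ModalitySubsystem(m) for m in ("vision", "touch", "hearing")}
    snapshot = integrate(systems, {
        "vision": [0.9, 0.8, 0.7],
        "touch": [0.1, 0.0, 0.2],
        "hearing": [0.6, 0.5, 0.9],
    })
    for a in snapshot:
        print(a)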

Re: [agi] reasoning & knowledge

2008-02-26 Thread Ben Goertzel
> You guys seem to think this - true common sense consciousness - can all be > cracked in a year or two. I think there's probably a lot of good reasons - > and therefore major creative problems - why it took a billion years of > evolution to achieve. I'm not trying to emulate the brain. Evolu

Re: [agi] reasoning & knowledge

2008-02-26 Thread Mike Tintner
Richard: Mike Tintner wrote: No one in AGI is aiming for common sense consciousness, are they? Inasmuch as I understand what you mean by that, yes of course. Both common sense and consciousness. As Ben said, it's something like "multisensory integrative consciousness" - i.e. you track a s

Re: [agi] reasoning & knowledge

2008-02-26 Thread Richard Loosemore
Mike Tintner wrote: No one in AGI is aiming for common sense consciousness, are they? Inasmuch as I understand what you mean by that, yes of course. Both common sense and consciousness. Richard Loosemore

Re: [agi] reasoning & knowledge

2008-02-26 Thread Ben Goertzel
> > No one in AGI is aiming for common sense consciousness, are they? > The OpenCog and NM architectures are in principle supportive of this kind of multisensory integrative consciousness, but not a lot of thought has gone into exactly how to support it ... In one approach, one would want to have

Re: [agi] reasoning & knowledge

2008-02-26 Thread Mike Tintner
Bob: It's this linguistic tagging of constellations of modality specific representations which is the real power of the human brain, since it can permit arbitrary configurations to be conjured up (literally re-membered) in a manner which is reasonably efficient from an information compression pers
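A toy sketch of the tagging idea, assuming nothing about Bob's actual design (modal_store, lexicon and remember are hypothetical names): a word is stored as a small set of pointers into modality-specific stores, so the full constellation can be reassembled from the tag alone. The compression benefit is that the tag itself carries only the pointers, not the rich sensory detail.

# modality -> representation id -> stored detail (stand-in for rich sensory data)
modal_store = {
    "vision": {"v17": "small red sphere"},
    "touch": {"t03": "smooth, waxy surface"},
    "smell": {"s09": "sweet, slightly tart"},
}

# A linguistic tag binds a constellation of modality-specific representations
# by reference; the tag stores pointers only, which is the compression step.
lexicon = {
    "apple": [("vision", "v17"), ("touch", "t03"), ("smell", "s09")],
}

def remember(tag: str) -> dict:
    """Re-assemble ('re-member') the constellation bound to a linguistic tag."""
    return {modality: modal_store[modality][rid] for modality, rid in lexicon[tag]}

print(remember("apple"))
# {'vision': 'small red sphere', 'touch': 'smooth, waxy surface', 'smell': 'sweet, slightly tart'}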

Re: [agi] reasoning & knowledge

2008-02-26 Thread Mike Tintner
Ben: Anyway, I agree with you that formal logical rules and inference are not the end-all of AGI and are not the right tool for handling visual imagination or motor learning. But I do think they have an important role to play even so. Just one thought here that is worth trying to express, althoug

Re: [agi] reasoning & knowledge

2008-02-26 Thread Bob Mottram
On 26/02/2008, Mike Tintner <[EMAIL PROTECTED]> wrote: > The idea that an AGI can symbolically encode all the knowledge, and perform > all the thinking, necessary to produce, say, a golf swing, let alone play a > symphony, is a pure fantasy. Our system keeps that knowledge and thinking > large

Re: [agi] reasoning & knowledge

2008-02-26 Thread Jim Bromer
Ben Goertzel <[EMAIL PROTECTED]> wrote: Anyway, I agree with you that formal logical rules and inference are not the end-all of AGI and are not the right tool for handling visual imagination or motor learning. But I do think they have an important role to play even so. -- Ben G Well, pure clo

Re: [agi] reasoning & knowledge

2008-02-26 Thread BillK
On Tue, Feb 26, 2008 at 8:29 PM, Ben Goertzel wrote: > > I don't think that formal logic is a suitably convenient language for > describing > motor movements or dealing with motor learning. > > But still, I strongly suspect one can produce software programs that do > handle > motor movement

Re: [agi] reasoning & knowledge

2008-02-26 Thread Ben Goertzel
> Your piano example is a good one. > > What it illustrates, I suggest, is: > > your knowledge of, and thinking about, how to play the piano, and perform > the many movements involved, is overwhelmingly imaginative and body > knowledge/thinking (contained in images and the motor parts of the b

Re: [agi] reasoning & knowledge

2008-02-26 Thread Mike Tintner
Ben: One advantage AGIs will have over humans is better methods for translating procedural to declarative knowledge, and vice versa. For us to translate "knowing how to do X" into "knowing how we do X" can be really difficult (I play piano improvisationally and by ear, and I have a hard time fi

Re: [agi] reasoning & knowledge

2008-02-26 Thread J Storrs Hall, PhD
On Tuesday 26 February 2008 12:33:32 pm, Jim Bromer wrote: > There is a lot of evidence that children do not learn through imitation, at least not in its truest sense. Haven't heard of any children born into, say, a purely French-speaking household suddenly acquiring a full-blown competence in

Re: [agi] reasoning & knowledge

2008-02-26 Thread Jim Bromer
Vladimir Nesov <[EMAIL PROTECTED]> wrote: Plus, I like to think about learning as a kind of imitation, and procedural imitation seems more direct. It's "substrate starting to imitate (adapt to) process with which it interacted" as opposed to "a system that observes a process, and then controls in

Re: [agi] reasoning & knowledge

2008-02-26 Thread Vladimir Nesov
On Tue, Feb 26, 2008 at 6:10 PM, Ben Goertzel <[EMAIL PROTECTED]> wrote: > > Knowing how to carry out inference can itself be procedural knowledge, > > in which case no explicit distinction between the two is required. > > > > -- > > Vladimir Nesov > > Representationally, the same formalis

Re: [agi] reasoning & knowledge

2008-02-26 Thread Ben Goertzel
> Knowing how to carry out inference can itself be procedural knowledge, > in which case no explicit distinction between the two is required. > > -- > Vladimir Nesov Representationally, the same formalisms can of course be used for both procedural and declarative knowledge. The slightly subtl
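Ben's point that one formalism can serve both kinds of knowledge can be shown with a small sketch (my own construction, not OpenCog or NM code): the same list of (relation, subject, object) tuples is read declaratively by a query and procedurally by a tiny interpreter that executes it as a plan.

# One store of (relation, a, b) tuples, used two ways below.
knowledge = [
    ("precedes", "boil water", "add pasta"),
    ("precedes", "add pasta", "drain pasta"),
]

def comes_before(a: str, b: str) -> bool:
    """Declarative use: answer a question about the stored relations."""
    return ("precedes", a, b) in knowledge

def run_plan(start: str) -> None:
    """Procedural use: interpret the same relations as an executable plan."""
    step = start
    while step is not None:
        print("doing:", step)
        successors = [b for (rel, a, b) in knowledge if rel == "precedes" and a == step]
        step = successors[0] if successors else None

print(comes_before("boil water", "add pasta"))  # True
run_plan("boil water")                          # boil water, add pasta, drain pasta

Nothing in the data structure itself marks it as procedural or declarative; the distinction lives entirely in how the system uses it, which is one way of reading Ben's remark.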

Re: [agi] reasoning & knowledge

2008-02-26 Thread Ben Goertzel
YKY, I'm with Pei on this one... Decades of trying to do procedure learning using logic have led only to some very brittle planners that are useful under very special and restrictive assumptions... Some of that work is useful but it doesn't seem to me to be pointing in an AGI direction. OTOH fo

Re: [agi] reasoning & knowledge

2008-02-26 Thread Vladimir Nesov
On Tue, Feb 26, 2008 at 3:03 PM, YKY (Yan King Yin) <[EMAIL PROTECTED]> wrote: > > Also, if you include procedural knowledge, things may be learned doubly in > your KB. For example, you may learn some declarative knowledge about the > concept of "reverse" and also procedural knowledge of how to re

Re: [agi] reasoning & knowledge

2008-02-26 Thread Pei Wang
On Tue, Feb 26, 2008 at 7:03 AM, YKY (Yan King Yin) <[EMAIL PROTECTED]> wrote: > > On 2/15/08, Pei Wang <[EMAIL PROTECTED]> wrote: > > > > To me, the following two questions are independent of each other: > > > > *. What type of reasoning is needed for AI? The major answers are: > > (A): deduction

Re: [agi] reasoning & knowledge

2008-02-26 Thread YKY (Yan King Yin)
On 2/15/08, Pei Wang <[EMAIL PROTECTED]> wrote: > > To me, the following two questions are independent of each other: > > *. What type of reasoning is needed for AI? The major answers are: > (A): deduction only, (B) multiple types, including deduction, > induction, abduction, analogy, etc. > > *. W

Re: [agi] reasoning & knowledge.. p.s.

2008-02-15 Thread Stephen Reed
David said: Most of the people on this list have quite different ideas about how an AGI should be made BUT I think there are a few things that most, if not all, agree on. 1. Intelligence can be created with software running on computers that exist today.

RE: [agi] reasoning & knowledge.. p.s.

2008-02-15 Thread David Clark
g software is possible or not. David Clark > From: Pei Wang > Subject: Re: [agi] reasoning & knowledge.. p.s. > You are correct that MOST PEOPLE in AI treat obs

Re: [agi] reasoning & knowledge.. p.s.

2008-02-14 Thread Pei Wang
On Thu, Feb 14, 2008 at 6:41 PM, Mike Tintner <[EMAIL PROTECTED]> wrote: > Pei, > > A misunderstanding. My point was not about the psychology of > observation/vision. I understand well that psychology and philosophy are > increasingly treating it as more active/reasoned and implicitly reference

Re: [agi] reasoning & knowledge.. p.s.

2008-02-14 Thread Mike Tintner
Pei, A misunderstanding. My point was not about the psychology of observation/vision. I understand well that psychology and philosophy are increasingly treating it as more active/reasoned and implicitly referenced Noe. My point is that *AI* and *AGI* treat observation as if it is passive rat

Re: [agi] reasoning & knowledge.. p.s.

2008-02-14 Thread Bob Mottram
On 14/02/2008, Ben Goertzel <[EMAIL PROTECTED]> wrote: > Who knows what we might have achieved had that level of dedication actually > continued for 4-7 more years? This kind of frustration is familiar to most inventors, and probably most people on this list. Likewise I'm pretty sure that if I

Re: [agi] reasoning & knowledge

2008-02-14 Thread Pei Wang
You don't need to keep me busy --- I'm already too busy to continue this discussion. I don't have all the answers to your questions. For the ones I do have answers to, I'm afraid I don't have the time to explain them to your satisfaction. Pei On Thu, Feb 14, 2008 at 5:23 PM, Mike Tintner <[EMAIL PR

Re: [agi] reasoning & knowledge.. p.s.

2008-02-14 Thread Ben Goertzel
Hi Mike, > P.S. I also came across this lesson that AGI forecasting must stop (I used > to make similar mistakes elsewhere). > > "We've been at it since mid-1998, and we estimate that within 1-3 years from > the time I'm writing this (March 2001), we will complete the creation of a > program

Re: [agi] reasoning & knowledge

2008-02-14 Thread Mike Tintner
Pei: > Though many people assume "reasoning" can only be applied to "symbolic" or "linguistic" materials, I'm not convinced yet, nor that there is really a separate "imaginative reasoning" --- at least I haven't seen a concrete proposal on what it means and why it is different. I should be s

Re: [agi] reasoning & knowledge.. p.s.

2008-02-14 Thread Pei Wang
On Thu, Feb 14, 2008 at 3:39 PM, Mike Tintner <[EMAIL PROTECTED]> wrote: > > Everyone is talking about observation as if it is PASSIVE - as if you just > record the world and THEN you start reasoning. Mike: I really hope you can stop making this kind of claim, for your own sake. For what people

Re: [agi] reasoning & knowledge.. p.s.

2008-02-14 Thread Mike Tintner
Pei: What type of reasoning is needed for AI? The major answers are: (A): deduction only, (B) multiple types, including deduction, induction, abduction, analogy, etc. And the other thing that AI presumably lacks currently - this sounds so obvious as to be almost silly to say, but I can't reme

Re: [agi] reasoning & knowledge

2008-02-14 Thread Pei Wang
Though many people assume "reasoning" can only be applied to "symbolic" or "linguistic" materials, I'm not convinced yet, nor that there is really a separate "imaginative reasoning" --- at least I haven't seen a concrete proposal on what it means and why it is different. For a simple deduction r

Re: [agi] reasoning & knowledge

2008-02-14 Thread Mike Tintner
Pei: What type of reasoning is needed for AI? The major answers are: (A): deduction only, (B) multiple types, including deduction, induction, abduction, analogy, etc. Is it fair to say that current AI involves an absence of imaginative reasoning? - reasoning that is conducted more or less

Re: [agi] reasoning & knowledge

2008-02-14 Thread Stephen Reed
From: Pei Wang <[EMAIL PROTECTED]> To: agi@v2.listbox.com Sent: Thursday, February 14, 2008 10:28:50 AM Subject: [agi] reasoning & knowledge Steve, To me, the following two questions are independent of each other: *. What type of reasoning is needed for AI? The major answers are:

[agi] reasoning & knowledge

2008-02-14 Thread Pei Wang
Steve, To me, the following two questions are independent of each other: *. What type of reasoning is needed for AI? The major answers are: (A): deduction only, (B) multiple types, including deduction, induction, abduction, analogy, etc. *. What type of knowledge should be reasoned upon? The maj
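For readers who want the distinction Pei draws to be concrete, here is a toy sketch of how deduction, induction and abduction differ as syllogistic patterns over simple "S -> P" statements. It is my own illustration, not NARS or any formalism Pei has proposed, and it omits the truth-value bookkeeping that makes inductive and abductive conclusions usable in practice.

def deduction(m_to_p, s_to_m):
    # M -> P and S -> M  |-  S -> P   (truth-preserving)
    (m, p), (s, m2) = m_to_p, s_to_m
    return (s, p) if m == m2 else None

def induction(m_to_p, m_to_s):
    # M -> P and M -> S  |-  S -> P   (a generalisation; may be wrong)
    (m, p), (m2, s) = m_to_p, m_to_s
    return (s, p) if m == m2 else None

def abduction(p_to_m, s_to_m):
    # P -> M and S -> M  |-  S -> P   (an explanatory guess; may be wrong)
    (p, m), (s, m2) = p_to_m, s_to_m
    return (s, p) if m == m2 else None

print(deduction(("bird", "animal"), ("robin", "bird")))       # ('robin', 'animal')
print(induction(("robin", "flies"), ("robin", "bird")))       # ('bird', 'flies')
print(abduction(("swan", "white"), ("this thing", "white")))  # ('this thing', 'swan')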