On 13/03/2008, Linas Vepstas <[EMAIL PROTECTED]> wrote:
> > object itself. How, say, do you get from a human face to the distorted
> > portraits of Modigliani, Picasso, Francis Bacon, Scarfe, or any
> > cartoonist?
> > By logical or mathematical formulae?
>
>
> Actually, yes. Computer vision p
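(Linas's reply is cut off, but the claim is easy to make concrete. Results
like Brennan's classic caricature generator really do get a distorted
portrait out of a formula: exaggerate a face's deviation from an average
face. A minimal Python sketch - the landmark coordinates below are invented
purely for illustration, not taken from any real system:)

import numpy as np

def caricature(face, mean_face, k=2.0):
    """Caricature by exaggeration: push every facial landmark away from
    the population-average face by a gain k. k = 1.0 reproduces the face;
    k > 1.0 distorts it in the direction of whatever is distinctive."""
    face = np.asarray(face, dtype=float)
    mean_face = np.asarray(mean_face, dtype=float)
    return mean_face + k * (face - mean_face)

# Invented 2-D landmarks: left eye, right eye, nose tip, mouth corners.
mean_face = [[-1.0, 1.0], [1.0, 1.0], [0.0, 0.0], [-0.8, -1.0], [0.8, -1.0]]
this_face = [[-1.2, 1.1], [1.0, 0.9], [0.1, -0.2], [-0.7, -1.1], [1.0, -0.9]]
print(caricature(this_face, mean_face, k=2.5))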
Bob Mottram wrote:
On 29/02/2008, Mike Tintner <[EMAIL PROTECTED]> wrote:
consciousness is a continuously moving picture with the other senses
continuous too
There doesn't seem to be much evidence for this. People with damage
to MT, or certain types of visual migraine, see the world as a slow
jerky series of snapshots.
Mike Tintner wrote:
Sorry, yes the "run" is ambiguous.
I mean that what the human mind does is *watch* continuous movies - but
it then runs/creates its own extensive movies based on its experience in
dreams - and, with some effort, replay movies in conscious imagination.
The point is: my impression is that in
Robert:
I think it would be more accurate to say that technological meme evolution was
caused by the biological evolution, rather than being the extension of it,
since they are in fact two quite different evolutionary systems, with different
kinds of populations/survival conditions.
I would sa
Mike,
Don't you know about change blindness and the like? You don't actually
see all these details, it's delusional. You only get the gist of the
scene, according to current context that forms the focus of your
attention. The amount of information you extract from watching a movie is
not dramatically
>
>
> d) you keep repeating the illusion that evolution did NOT achieve the
> airplane and other machines - oh yes, it did - your central illusion here
> is
> that machines are independent species. They're not. They are
> EXTENSIONS of
> human beings, and don't work without human beings attached.
Eh? Move your hand across the desk. You see that as a series of snapshots?
Move a noisy object across. You don't see a continuous picture with a
continuous soundtrack?
Let me give you an example of how impressive I think the brain's powers here
are. I've been thinking about metaphor and the su
Mike Tintner wrote:
Er, just to clarify. You guys have, or know of, AI systems which run
continuous movies of the world, analysing and responding to those movies
with all the relevant senses, as discussed below, and then to the world
beyond those movies, in real time (or any time, for that matter)?
Ben Goertzel <[EMAIL PROTECTED]> wrote:
That is purely rhetorical gamesmanship
ben
I studied some rhetoric, and while I learned how to avoid some of the worst
pitfalls of gamesmanship and how to avoid wasting *all* of my time, I found that
the study, which was one of Aristotle's subjects by
Mike Tintner wrote:
> You're crossing a road - you track both the oncoming car and your body
> with all your senses at once - see a continuous moving image of the
> car, hear the noise of the engine and tires, possibly smell it if
> there's a smell of gasoline, have a kinaesthetic sense of
On 27/02/2008, Ben Goertzel <[EMAIL PROTECTED]> wrote:
> I don't buy that my body plays a significant role in thinking about, for
> instance, mathematics. I bet that my brain in a vat could think about math
> just as well or better than my embodied brain.
>
> Of course my brain is what it
I do not doubt that body-thinking exists and is important; my doubt is that it
is in any AGI-useful sense "the largest part" of thinking...
On Wed, Feb 27, 2008 at 1:07 PM, Mike Tintner <[EMAIL PROTECTED]> wrote:
> Ben: What evidence do you have that this [body thinking] is the "largest
> part" ...
J Storrs Hall, PhD wrote:
On Wednesday 27 February 2008 12:22:30 pm, Richard Loosemore wrote:
Mike Tintner wrote:
As Ben said, it's something like "multisensory integrative
consciousness" - i.e. you track a subject/scene with all senses
simultaneously and integratedly.
Conventional approaches to AI may
Ben: What evidence do you have that this [body thinking] is the "largest
part" ... it does not feel at all that way to me, as a
subjectively-experiencing human; and I know of no evidence in this regard
Like I said, I'm at the start here - and this is going against thousands of
years of literat
> Well, what I and embodied cognitive science are trying to formulate
> properly, both philosophically and scientifically, is why:
>
> a) common sense consciousness is the brain-AND-body thinking on several
> levels simultaneously about any given subject...
I don't buy that my body plays a si
Mike Tintner wrote:
Richard: Mike Tintner wrote:
No one in AGI is aiming for common sense consciousness, are they?
Inasmuch as I understand what you mean by that, yes of course.
Both common sense and consciousness.
As Ben said, it's something like "multisensory integrative
consciousness" - i.e. you track a subject/scene with all senses
simultaneously and integratedly.
Ben: MT:>> You guys seem to think this - true common sense consciousness -
can all be cracked in a year or two. I think there's probably a lot of good
reasons - and therefore major creative problems - why it took a billion years
of evolution to achieve.
Ben: I'm not trying to emulate the brain.
What I tried to do with robocore is have a number of subsystems
dedicated to particular modalities such as vision, touch, hearing,
smell and so on. Each of these modalities operates in a semi
independent and self organised way, and their function is to create
stable abstractions from the raw data
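(Bob's description is architectural rather than algorithmic, but the shape is
easy to sketch. A toy Python skeleton of what he describes - the class names,
and the crude quantisation standing in for self-organised abstraction, are my
assumptions, not robocore's actual code:)

class Modality:
    """One semi-independent sensory subsystem. Crude quantisation stands
    in for whatever self-organising process builds stable abstractions
    from raw data in the real system."""
    def __init__(self, name, resolution=10.0):
        self.name = name
        self.resolution = resolution

    def abstract(self, raw_sample):
        # Nearby raw values collapse to the same token: a "stable" percept.
        return (self.name, round(raw_sample / self.resolution))

class SensoryCore:
    """Runs the modalities side by side and gathers the current
    constellation of abstractions for downstream binding."""
    def __init__(self, modalities):
        self.modalities = {m.name: m for m in modalities}

    def perceive(self, samples):
        # samples: {modality_name: raw_value}
        return [self.modalities[n].abstract(v) for n, v in samples.items()]

core = SensoryCore([Modality("vision"), Modality("touch"), Modality("hearing")])
print(core.perceive({"vision": 42.0, "hearing": 97.5}))
# -> [('vision', 4), ('hearing', 10)]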
>
> No one in AGI is aiming for common sense consciousness, are they?
>
The OpenCog and NM architectures are in principle supportive of this kind
of multisensory integrative consciousness, but not a lot of thought has gone
into exactly how to support it ...
In one approach, one would want to have
Bob: It's this linguistic tagging of
constellations of modality specific representations which is the real
power of the human brain, since it can permit arbitrary configurations
to be conjured up (literally re-membered) in a manner which is
reasonably efficient from an information compression pers
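(Read as code, Bob's point is that a word acts as a cheap index over
modality-specific representations: store the tag once, and the whole
constellation can be conjured back up. A toy sketch - the dictionary layout
is my own assumption, purely illustrative:)

class Lexicon:
    """Binds a linguistic tag to a constellation of modality-specific
    representations; 're-membering' is just reassembling them from the tag."""
    def __init__(self):
        self.bindings = {}

    def tag(self, word, **modal_reps):
        self.bindings[word] = modal_reps

    def remember(self, word):
        return self.bindings.get(word, {})

    def conjure(self, *words):
        # Arbitrary configurations: merge constellations never co-experienced.
        combined = {}
        for w in words:
            combined.update(self.remember(w))
        return combined

lex = Lexicon()
lex.tag("apple", vision="red round blob", touch="smooth, firm", smell="sweet")
lex.tag("purple", vision="purple hue")
print(lex.remember("apple"))           # one cheap tag -> full constellation
print(lex.conjure("apple", "purple"))  # later tags override: a never-seen
                                       # 'purple apple'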
Ben: Anyway, I agree with you that formal logical rules and inference are not
the end-all of AGI and are not the right tool for handling visual imagination
or motor learning. But I do think they have an important role to play even so.
Just one thought here that is worth trying to express, althoug
On 26/02/2008, Mike Tintner <[EMAIL PROTECTED]> wrote:
> The idea that an AGI can symbolically encode all the knowledge, and perform
> all the thinking, necessary to produce, say, a golf swing, let alone play a
> symphony, is a pure fantasy. Our system keeps that knowledge and thinking
> large
Ben Goertzel <[EMAIL PROTECTED]> wrote:
Anyway, I agree with you that formal logical rules and inference are not the
end-all of AGI and are not the right tool for handling visual imagination or
motor learning. But I do think they have an important role to play even so.
-- Ben G
Well, pure clo
On Tue, Feb 26, 2008 at 8:29 PM, Ben Goertzel wrote:
>
> I don't think that formal logic is a suitably convenient language for
> describing motor movements or dealing with motor learning.
>
> But still, I strongly suspect one can produce software programs that do
> handle motor movement
> Your piano example is a good one.
>
> What it illustrates, I suggest, is:
>
> your knowledge of, and thinking about, how to play the piano, and perform
> the many movements involved, is overwhelmingly imaginative and body
> knowledge/thinking (contained in images and the motor parts of the brain
Ben: One advantage AGIs will have over humans is better methods for translating
procedural to declarative knowledge, and vice versa. For us to translate
"knowing how to do X" into "knowing how we do X" can be really difficult
(I play piano improvisationally and by ear, and I have a hard time fi
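(Ben's translation claim can be illustrated crudely: a system that executes a
procedure can also trace its own execution, and the trace is already
declarative. A toy Python sketch, with sorting standing in for piano-playing:)

def insertion_sort_traced(xs):
    """Procedural knowledge: *how* to sort. The trace turns each step
    into a declarative fact about what was done and why."""
    xs, facts = list(xs), []
    for i in range(1, len(xs)):
        j = i
        while j > 0 and xs[j - 1] > xs[j]:
            facts.append(f"swap(pos {j-1}, pos {j}) because {xs[j-1]} > {xs[j]}")
            xs[j - 1], xs[j] = xs[j], xs[j - 1]
            j -= 1
    return xs, facts

sorted_xs, declarative_trace = insertion_sort_traced([3, 1, 2])
print(sorted_xs)           # knowing how: the result of doing
print(declarative_trace)   # knowing that: inspectable statements about the doing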
On Tuesday 26 February 2008 12:33:32 pm, Jim Bromer wrote:
> There is a lot of evidence that children do not learn through imitation, at
> least not in its truest sense.
Haven't heard of any children born into, say, a purely French-speaking
household suddenly acquiring a full-blown competence in
Vladimir Nesov <[EMAIL PROTECTED]> wrote: Plus, I like to
think about learning as a kind of imitation, and procedural imitation
seems more direct. It's "substrate starting to imitate (adapt to)
process with which it interacted" as opposed to "a system that
observes a process, and then controls in
> Knowing how to carry out inference can itself be procedural knowledge,
> in which case no explicit distinction between the two is required.
>
> --
> Vladimir Nesov
Representationally, the same formalisms can of course be used for both
procedural and declarative knowledge.
The slightly subtle
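(The representational point is easy to make concrete: the same rule structure
can be run, which is procedural use, or reasoned about, which is declarative
use. A minimal sketch with condition-action pairs as the shared formalism -
my toy encoding, not Novamente's:)

# One formalism: (condition, action) rules as plain data.
rules = [
    ("hungry", "eat"),
    ("tired",  "sleep"),
    ("eat",    "wash dishes"),
]

# Procedural use: interpret the rules to drive behaviour.
def act(state):
    for condition, action in rules:
        if condition == state:
            return action
    return "idle"

# Declarative use: reason *about* the same rules, e.g. chain consequences.
def consequences(state, depth=3):
    chain = []
    for _ in range(depth):
        nxt = act(state)
        if nxt == "idle":
            break
        chain.append(nxt)
        state = nxt
    return chain

print(act("hungry"))           # -> "eat"
print(consequences("hungry"))  # -> ["eat", "wash dishes"]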
YKY,
I'm with Pei on this one...
Decades of trying to do procedure learning using logic have led only to some
very brittle planners that are useful under very special and restrictive
assumptions...
Some of that work is useful but it doesn't seem to me to be pointing in an AGI
direction.
OTOH fo
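(To see what "brittle" means here, a deliberately minimal STRIPS-style
forward search - my own toy, not any production planner. It succeeds only
because the world is closed, fully observable and exactly described; delete
one modelled fact and it simply returns nothing:)

from collections import deque

# STRIPS-style operators: (name, preconditions, add-list, delete-list).
ACTIONS = [
    ("pick_up", {"hand_empty", "ball_on_floor"},
     {"holding_ball"}, {"hand_empty", "ball_on_floor"}),
    ("put_in_box", {"holding_ball", "box_open"},
     {"ball_in_box", "hand_empty"}, {"holding_ball"}),
]

def plan(initial, goal):
    """Breadth-first search over fully specified symbolic states."""
    start = frozenset(initial)
    frontier, seen = deque([(start, [])]), {start}
    while frontier:
        state, steps = frontier.popleft()
        if goal <= state:
            return steps
        for name, pre, add, delete in ACTIONS:
            if pre <= state:
                nxt = frozenset((state - delete) | add)
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append((nxt, steps + [name]))
    return None  # no plan: the formalism has no room for improvisation

print(plan({"hand_empty", "ball_on_floor", "box_open"}, {"ball_in_box"}))
# -> ['pick_up', 'put_in_box']; drop "box_open" from the initial state
#    and the planner just returns None.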
On Tue, Feb 26, 2008 at 3:03 PM, YKY (Yan King Yin)
<[EMAIL PROTECTED]> wrote:
>
> Also, if you include procedural knowledge, things may be learned doubly in
> your KB. For example, you may learn some declarative knowledge about the
> concept of "reverse" and also procedural knowledge of how to reverse
On Tue, Feb 26, 2008 at 7:03 AM, YKY (Yan King Yin)
<[EMAIL PROTECTED]> wrote:
>
> On 2/15/08, Pei Wang <[EMAIL PROTECTED]> wrote:
> >
> > To me, the following two questions are independent of each other:
> >
> > *. What type of reasoning is needed for AI? The major answers are:
> > (A): deduction
David said:
Most of the people on this list have quite different ideas about how an AGI
should be made BUT I think there are a few things that most, if not all,
agree on.
1. Intelligence can be created by using computers that exist today using
software.
g software is possible or not.
David Clark
> -Original Message-
> From: Pei Wang [mailto:[EMAIL PROTECTED]
> Sent: February-14-08 5:11 PM
> To: agi@v2.listbox.com
> Subject: Re: [agi] reasoning & knowledge.. p.s.
>
> You are correct that MOST PEOPLE in AI treat obs
Pei,
A misunderstanding. My point was not about the psychology of
observation/vision. I understand well that psychology and philosophy are
increasingly treating it as more active/reasoned and implicitly referenced
Noe. My point is that *AI* and *AGI* treat observation as if it is passive
rat
On 14/02/2008, Ben Goertzel <[EMAIL PROTECTED]> wrote:
> Who knows what we might have achieved had that level of dedication actually
> continued for 4-7 more years?
This kind of frustration is familiar to most inventors, and probably
most people on this list. Likewise I'm pretty sure that if I
You don't need to keep me busy --- I'm already too busy to continue
this discussion.
I don't have all the answers to your questions. For the ones I do have
answers to, I'm afraid I don't have the time to explain them to your
satisfaction.
Pei
On Thu, Feb 14, 2008 at 5:23 PM, Mike Tintner <[EMAIL PR
Hi Mike,
> P.S. I also came across this lesson that AGI forecasting must stop (I used
> to make similar mistakes elsewhere).
>
> "We've been at it since mid-1998, and we estimate that within 1-3 years from
> the time I'm writing this (March 2001), we will complete the creation of a
> program
Pei: > Though many people assume "reasoning" can only be applied to
"symbolic" or "linguistic" materials, I'm not convinced yet, nor that
there is really a separate "imaginative reasoning" --- at least I
haven't seen a concrete proposal on what it means and why it is
different.
I should be s
On Thu, Feb 14, 2008 at 3:39 PM, Mike Tintner <[EMAIL PROTECTED]> wrote:
>
> Everyone is talking about observation as if it is PASSIVE - as if you just
> record the world and THEN you start reasoning.
Mike: I really hope you can stop making this kind of claim, for your own sake.
For what people
Pei: What type of reasoning is needed for AI? The major answers are:
(A): deduction only, (B) multiple types, including deduction,
induction, abduction, analogy, etc.
And the other thing that AI presumably lacks currently - this sounds so
obvious as to be almost silly to say, but I can't reme
Though many people assume "reasoning" can only be applied to
"symbolic" or "linguistic" materials, I'm not convinced yet, nor that
there is really a separate "imaginative reasoning" --- at least I
haven't seen a concrete proposal on what it means and why it is
different.
For a simple deduction r
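(Pei's message is cut off, but the three inference types he names have a
standard syllogistic reading in his published NARS writings. A bare-bones
Python sketch of the patterns, truth values omitted - the code is mine, not
NARS itself:)

def deduction(p1, p2):
    # From M->P and S->M, conclude S->P (the strongest of the three).
    (m1, p), (s, m2) = p1, p2
    return (s, p) if m1 == m2 else None

def induction(p1, p2):
    # From M->P and M->S, hypothesise S->P (generalisation, weaker).
    (m1, p), (m2, s) = p1, p2
    return (s, p) if m1 == m2 else None

def abduction(p1, p2):
    # From P->M and S->M, hypothesise S->P (explanation, weaker).
    (p, m1), (s, m2) = p1, p2
    return (s, p) if m1 == m2 else None

# Statements written as (subject, predicate), i.e. "robin -> bird":
print(deduction(("bird", "animal"), ("robin", "bird")))    # ('robin', 'animal')
print(induction(("bird", "flyer"), ("bird", "feathered"))) # ('feathered', 'flyer')
print(abduction(("penguin", "bird"), ("swimmer", "bird"))) # ('swimmer', 'penguin')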
Pei: What type of reasoning is needed for AI? The major answers are:
(A): deduction only, (B) multiple types, including deduction,
induction, abduction, analogy, etc.
Is it fair to say that current AI involves an absence of imaginative
reasoning? - reasoning that is conducted more or less
Steve,
To me, the following two questions are independent of each other:
*. What type of reasoning is needed for AI? The major answers are:
(A): deduction only, (B) multiple types, including deduction,
induction, abduction, analogy, etc.
*. What type of knowledge should be reasoned upon? The maj