[agi] Emergent languages Org

2008-02-03 Thread Mike Tintner
Jeez, there's always something new. Does anyone know about this (which seems
at a glance loosely relevant to Ben's approach)?


http://www.emergent-languages.org/

Overview

This site provides an introduction to the research on emergent and 
evolutionary languages as conducted at the Sony Computer Science Laboratory 
in Paris and the AI-Lab at the VUB in Brussels. One of the principal 
objectives of this research is to identify the cognitive capabilities that 
artificial agents must possess to enable, in a population of such agents, the 
emergence and evolution of a language that exhibits characteristic features 
identified in natural languages.


Looks like it is Sony (Aibo) financed. Luc Steels seems to be a principal figure. 
This is quite fun:


http://www.csl.sony.fr/~py/clickerTraining.htm

Here he explains/justifies his approach:

http://www.csl.sony.fr/downloads/papers/2006/steels-06a.pdf

And how did I get to all this? Tangentially, from Construction Grammar, 
which is yet another interesting aspect of cognitive linguistics:


http://en.wikipedia.org/wiki/Construction_grammar 



-
This list is sponsored by AGIRI: http://www.agiri.org/email
To unsubscribe or change your options, please go to:
http://v2.listbox.com/member/?member_id=8660244&id_secret=93139505-4aa549


Re: [agi] Emergent languages Org

2008-02-03 Thread Bob Mottram
I haven't read any of Luc Steels' stuff for a long time, but he has been
researching the evolution of language using robots or software agents
since the early 1990s.  This is really a symbol grounding problem,
where the communication in some way needs to represent things or
situations which the agent can perceive with its sensors.

Some years ago I tried to do something similar to Pierre Oudeyer's
video using a humanoid robot - presenting objects and saying "this is
a..." or "what is this?" or "is this a...?".  I didn't go very far
down this route because I found that visual recognition of objects
constitutes the major part of the problem.  It is possible to use SIFT
features and geometric hashes (which I think is what the AIBO robot is
doing in this demo), but these 2D methods just aren't very good on
objects with complicated 3D shapes.  Since I'm interested in making
machines which are genuinely intelligent, as opposed to appearing to
be intelligent in a five-minute demo, I've spent most of my efforts on
the 3D object recognition problem.  It turns out that other things,
such as mapping, navigation and SLAM, are fundamentally related to
this problem.
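As a rough illustration of the geometric-hashing half of that pipeline, here is a hypothetical pure-Python sketch over bare 2D points (real systems hash SIFT descriptors, not raw coordinates): model points are stored in a table keyed by their coordinates in every basis-pair frame, which makes lookup invariant to translation, rotation and scale. These are exactly the 2D invariances that break down for complicated 3D shapes.

```python
import itertools
from collections import defaultdict

def basis_coords(p, o, u):
    # coordinates of point p in the frame whose origin is o and whose
    # x-axis runs from o to u (translation/rotation/scale invariant)
    ux, uy = u[0] - o[0], u[1] - o[1]
    vx, vy = -uy, ux                       # perpendicular axis
    px, py = p[0] - o[0], p[1] - o[1]
    d = ux * ux + uy * uy
    return round((px * ux + py * uy) / d, 1), round((px * vx + py * vy) / d, 1)

def build_table(model):
    # hash every model point against every ordered basis pair
    table = defaultdict(set)
    for o, u in itertools.permutations(model, 2):
        for p in model:
            if p not in (o, u):
                table[basis_coords(p, o, u)].add((o, u))
    return table

def recognize(table, scene):
    # vote for scene basis pairs whose remaining points hit table entries
    votes = defaultdict(int)
    for o, u in itertools.permutations(scene, 2):
        for p in scene:
            if p not in (o, u):
                votes[(o, u)] += len(table.get(basis_coords(p, o, u), ()))
    return max(votes.values()) if votes else 0
```

`recognize` returns the best vote count over candidate bases; translated or rotated copies of a three-point model score 1 (the single non-basis point matches), while an unrelated shape scores 0.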






Re: [agi] Emergent languages Org

2008-02-03 Thread Ben Goertzel
Thanks for the references...

I found this paper

Kaplan, F., Oudeyer, P-Y., Kubinyi, E. and Miklosi, A. (2002) Robotic
clicker training, Robotics and Autonomous Systems, 38(3-4), pp. 197-206.

at (near the bottom)

http://www.csl.sony.fr/~py/clickerTraining.htm

interesting in terms of highlighting the difference between virtual-world
and physical-robotics teaching of agents, as well as the basic
difference between Novamente's Virtual Animal Brain system and real
dog brains...

They point out that imitation learning is rarely used for teaching
animals, both because animals are bad at imitation, and because of
differences between human and animal anatomy.

However, Novamente is good at imitation, and in a virtual-world
context the differences between human and animal anatomy can be
finessed pretty easily (via simply supplying the virtual animal with
suggestions about how to map specific human-avatar animations into
specific animal-avatar animations).

What they advocate in the paper, for teaching robots, is "clicker
training", which is basically Skinnerian reinforcement learning with a
judicious, time-variant sequence of partial rewards.  At first you
reward the animal for doing 1/10 of the behavior right; then, after it
can do that, you reward it for doing 2/10 of the behavior right, etc.
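A toy sketch of that shaping schedule in Python (hypothetical throughout: the target sequence, the preference weights and the criterion threshold are all invented for illustration, and this is not the actual Kaplan/Oudeyer training setup):

```python
import random

def shape_behavior(target, trials=20000, seed=0):
    """Clicker-style shaping: reward the agent for matching a growing
    prefix of the target action sequence (the '1/10, then 2/10, ...'
    schedule), raising the criterion once the current prefix is reliable."""
    rng = random.Random(seed)
    actions = sorted(set(target))
    # one preference weight per (position, action); the agent samples
    # each action in proportion to its weight
    weights = [{a: 1.0 for a in actions} for _ in target]
    criterion = 1                       # length of the prefix being rewarded
    for _ in range(trials):
        attempt = [rng.choices(list(w), weights=list(w.values()))[0]
                   for w in weights]
        if attempt[:criterion] == list(target[:criterion]):
            for i in range(criterion):  # "click": reinforce the whole prefix
                weights[i][attempt[i]] += 1.0
            if (criterion < len(target)
                    and weights[criterion - 1][target[criterion - 1]] > 20):
                criterion += 1          # raise the bar
    return weights, criterion
```

With a three-action target the criterion typically climbs to the full sequence within a few hundred trials; the schedule is the point, since rewarding only complete performances from the start would almost never produce a reinforceable success.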

In their work on language learning

http://www.csl.sony.fr/~py/languageAcquisition.htm

I see nothing coming remotely close to a discussion of the learning of
syntax or complex semantics ... what I see is some experiments in
which robots learned, through spontaneous exploration and
reinforcement, the simple fact that vocalizing toward other agents is
a useful thing to do.  Which is certainly interesting ... but it's
really just a matter of "learning THAT vocal communication exists", in
a setting where not that many other possibilities exist...

-- Ben G





-- 
Ben Goertzel, PhD
CEO, Novamente LLC and Biomind LLC
Director of Research, SIAI
[EMAIL PROTECTED]

"If men cease to believe that they will one day become gods then they
will surely become worms."
-- Henry Miller



Re: [agi] Emergent languages Org

2008-02-03 Thread Stephen Reed
I have been collaborating with this lab on their Fluid Construction Grammar 
system, as described briefly in this blog post: 
http://texai.org/blog/2007/10/24/fluid-construction-grammar

I downloaded their Common Lisp implementation, rewrote it in Java, and 
demonstrated that I could achieve the same results as their Lisp 
implementation.  Then I extended it to parse incrementally, i.e. word-by-word, 
strictly left to right, creating semantics at each step.
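A heavily simplified sketch of the word-by-word idea (hypothetical Python with an invented four-word lexicon; real FCG constructions carry far richer syntax and semantics). Each word merges its semantic contribution into a growing structure, so a partial interpretation is available after every word:

```python
# Invented toy lexicon mapping words to semantic contributions.
LEXICON = {
    "the":   {"det": "definite"},   # carried but unused in this sketch
    "dog":   {"entity": "dog"},
    "man":   {"entity": "man"},
    "bites": {"event": "bite"},
}

def parse_incrementally(sentence):
    """Consume words strictly left to right, merging each word's semantic
    contribution into a growing structure; emit a snapshot per word."""
    state = {"entities": [], "event": None}
    snapshots = []
    for word in sentence.split():
        sem = LEXICON.get(word, {})
        if "entity" in sem:
            state["entities"].append(sem["entity"])
        if "event" in sem:
            state["event"] = sem["event"]
        snapshots.append((word, {"entities": list(state["entities"]),
                                 "event": state["event"]}))
    return snapshots
```

For "the dog bites the man", the snapshot after "dog" already contains one entity and no event, and the final snapshot contains both entities plus the bite event, which is the sense in which semantics exists at each step rather than only at the end of the sentence.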

I have not studied the theory of emergent languages, as I am focused on what I 
think is their excellent production rule engine for bi-directional grammars.  I 
would be glad to provide an introduction on the LinkedIn network to Pieter 
Wellens, a PhD student at the associated VUB AI-Lab.

-Steve
 
Stephen L. Reed 
Artificial Intelligence Researcher
http://texai.org/blog
http://texai.org
3008 Oak Crest Ave.
Austin, Texas, USA 78704
512.791.7860


Re: [agi] Emergent languages Org

2008-02-03 Thread Joseph Gentle
I doubt that 3D object recognition is integral to 'genuine
intelligence'. Theoretically, if we had an AGI, we should be able to
put it in a simulated 2D world and it would still act intelligently.

IMO language is integral to strong AI in the same way that logic is
integral to mathematics. If you think about it, human languages are
basically higher-order logics with fuzzy expressions, probabilities
and context. That sounds like a fabulous description of the higher-order
logic our brains use internally to store thoughts.

I haven't read any of Steels' stuff lately, either. I'm not sure if any
of the language he's generating is higher-order, but I wouldn't be so
quick to dismiss emergent language generation as a trick for
five-minute demos.

-J




Re: [agi] Emergent languages Org

2008-02-03 Thread Ben Goertzel
> IMO language is integral to strong AI in the same way that logic is
> integral to mathematics.

The counterargument is that no one has yet made an AI virtual chimp ...
and nearly all of the human brain is the same as that of a chimp ...

I think that language-centric approaches are viable, but I wouldn't dismiss
sensorimotor-centric approaches to AGI either ... looking at evolutionary
history, it seems that ONE way to achieve linguistic functionality is via
some relatively minor tweaks on a prelinguistic mind tuned for flexible
sensorimotor learning... (though I don't believe this is the only way, unlike some)

-- Ben



Re: [agi] Emergent languages Org

2008-02-03 Thread Joseph Gentle

Interesting! You make a very good point.

I'd be interested in what you see as the path from SLAM to AGI.

To me, language generation seems obvious:
1. Make a language, and algorithms for generating utterances in that language.
2. Implement pattern recognition and abstraction (IMO not _that_ hard if
you've designed your language well).
3. Ground the language through real-world sensorimotor experiences so the
utterances mirror the agents' experiences.
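Step 3 can be caricatured as cross-situational learning (a hypothetical Python sketch with invented percept labels): ground each word in whatever percept it co-occurs with most reliably across sensorimotor episodes.

```python
from collections import defaultdict

def learn_groundings(episodes):
    """Cross-situational learning sketch: count word/percept co-occurrences
    across (utterance, perceived-situation) episodes and ground each word
    in the percept it co-occurs with most often."""
    counts = defaultdict(lambda: defaultdict(int))
    for utterance, percepts in episodes:
        for word in utterance.split():
            for percept in percepts:
                counts[word][percept] += 1
    return {w: max(ps, key=ps.get) for w, ps in counts.items()}

# invented episodes: each utterance is heard while some percepts are active
EPISODES = [
    ("red ball",  ["COLOR:red",  "SHAPE:ball", "LOC:floor"]),
    ("blue ball", ["COLOR:blue", "SHAPE:ball", "LOC:table"]),
    ("red cube",  ["COLOR:red",  "SHAPE:cube", "LOC:table"]),
    ("blue cube", ["COLOR:blue", "SHAPE:cube", "LOC:floor"]),
]
```

After these four episodes "red" grounds in COLOR:red and "ball" in SHAPE:ball, because each word co-occurs with its referent twice but with every distractor only once; no single episode is unambiguous, which is the point.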

What do you see as the equivalent path from mapping, navigation and SLAM?

-J



Re: [agi] Emergent languages Org

2008-02-03 Thread Ben Goertzel
Hi,


Mapping, navigation and SLAM are not the key point -- embodied learning is
the point ... these are just prerequisites...

The robotics path to AI is a lot like the evolutionary path to natural
intelligence...

Create a system that learns to achieve simple sensorimotor goals in
its environment...
then move on to social goals... and language eventually emerges as an aspect of
social interaction...

Rather than language being a separate thing that is then grounded in experience,
make language **emerge** from nonlinguistic interactions ... as it
happened historically
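Steels' "naming game" experiments are the canonical demonstration of this; here is a minimal sketch (hypothetical Python, with a deliberately crude hearer-adopts-on-failure rule in place of Steels' lateral-inhibition scoring). A shared vocabulary emerges from purely local interactions, with no agent ever seeing the population's lexicon:

```python
import random

def naming_game(n_agents=10, n_objects=5, rounds=20000, seed=1):
    """Each round, a random speaker names a random object for a random
    hearer, inventing a word if it has none; on mismatch the hearer
    adopts the speaker's word.  Consensus emerges without coordination."""
    rng = random.Random(seed)
    vocab = [[None] * n_objects for _ in range(n_agents)]  # word per object
    for _ in range(rounds):
        speaker, hearer = rng.sample(range(n_agents), 2)
        obj = rng.randrange(n_objects)
        if vocab[speaker][obj] is None:
            vocab[speaker][obj] = f"w{rng.randrange(10**6)}"  # invent a word
        if vocab[hearer][obj] != vocab[speaker][obj]:
            vocab[hearer][obj] = vocab[speaker][obj]          # align
    return vocab
```

Several competing invented words circulate early on, then the population drifts to one word per object; nothing in the update rule mentions language as a separate subsystem, which is the sense in which the lexicon "emerges" from the interactions.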

See Mithen's The Singing Neanderthals for ideas about how language may
have emerged from prelinguistic sound-making ... and a host of
researchers for ideas about how language may have emerged from gesture
(I have a paper touching on the latter at novamente.net/papers).

-- Ben G



Re: [agi] Emergent languages Org

2008-02-03 Thread Joseph Gentle

You might be right, but I'm very skeptical. I don't see how any
complex behaviour can simply 'emerge' from strict algorithms like
SLAM. SLAM as I know it doesn't allow for emergent behaviour at all
(except for its explicit mapping ability).

Eventually, you will have to write something which allows for emergent
behaviour and complex communication. To me, that stage of your project
is the interesting crux of AGI. It should have some very interesting
emergent behaviour with inputs other than the information SLAM
outputs.

Why not just work on that difficult part now?

-J



Re: [agi] Emergent languages Org

2008-02-04 Thread Bob Mottram
On 04/02/2008, Joseph Gentle <[EMAIL PROTECTED]> wrote:
> I haven't read any of Steels' stuff lately, either. I'm not sure if any
> of the language he's generating is higher-order, but I wouldn't be so
> quick to dismiss emergent language generation as a trick for
> five-minute demos.

Well, if you take something like the "talking heads" experiment
(http://www.isrl.uiuc.edu/~amag/langev/cited2/steelsthetalkingheadsexperiment.html)
and ask what it would take to scale this up to human-like language
abilities, inevitably you're always drawn back to the fact that the
images used are of a trivial nature.  If the images which the cameras
were observing were natural scenes, I doubt that talking heads (as it
existed in 1999/2000) would have been able to deliver meaningful
results.

There needs to be some kind of reliable pattern which you can
correlate your linguistics with.  Uncertainties can be dealt with, but
if the pattern is completely unreliable from one observation to the
next, you're lost.  Simulation doesn't really deal with the problem, or
rather it deals with it by ignoring it.  In a simulation you can take
shortcuts which would never be possible in real life.  In the real
world objects do not come pre-labeled; instead they have to be learned
from experience, which is ultimately delivered to us through our
senses.



Re: [agi] Emergent languages Org

2008-02-04 Thread Joseph Gentle
On Feb 4, 2008 7:38 PM, Bob Mottram <[EMAIL PROTECTED]> wrote:
> Well, if you take something like the "talking heads" experiment
> (http://www.isrl.uiuc.edu/~amag/langev/cited2/steelsthetalkingheadsexperiment.html)
> and ask what it would take to scale this up to human-like language
> abilities, inevitably you're always drawn back to the fact that the
> images used are of a trivial nature.

Perhaps. However, I think there's at least as much work required to
take a robot (with localisation + mapping if you like) and scale it up
to communicate with human-like language.

> There needs to be some kind of reliable pattern which you can
> correlate your linguistics with.  Uncertainties can be dealt with, but
> if the pattern is completely unreliable from one observation to the
> next, you're lost.  Simulation doesn't really deal with the problem, or
> rather it deals with it by ignoring it.

This is a very good point. Reliable patterns are important, and dealing
with uncertainty in your patterns is critical in real-world
situations. That said, it might be possible to make an AGI without
a decent ability to deal with uncertainty and then program that in later.
It's hard to tell. What do you guys think?

-J
