I have been collaborating with this lab on their Fluid Construction Grammar 
system, as described briefly in this blog post: 
http://texai.org/blog/2007/10/24/fluid-construction-grammar

I downloaded their Common Lisp implementation and rewrote it in Java and 
demonstrated that I could achieve the same results as their Lisp 
implementation.  Then I extended it to parse incrementally, i.e. word by word, 
strictly left to right, creating semantics at each step.
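To make the idea concrete, here is a minimal sketch of word-by-word, strictly left-to-right parsing that accumulates a semantic fragment at each step. This is purely illustrative: the toy lexicon and the function name are my own inventions, not the Texai or FCG API, and a real construction grammar applies bidirectional constructions rather than a flat word-to-predicate lookup.

```python
# Hypothetical sketch of incremental left-to-right parsing (NOT the
# Texai/FCG implementation): each word contributes a meaning predicate,
# and we snapshot the accumulated semantics after every step.

LEXICON = {  # toy stand-in for a bidirectional construction inventory
    "the": ("det", "definite"),
    "dog": ("noun", "dog(x)"),
    "barks": ("verb", "barks(x)"),
}

def parse_incrementally(words):
    """Consume words strictly left to right, returning the partial
    semantics available after each word is processed."""
    semantics = []
    states = []
    for word in words:
        category, meaning = LEXICON[word]
        semantics.append(meaning)
        states.append(list(semantics))  # snapshot: semantics so far
    return states
```

The point of the snapshot list is that a downstream consumer can act on partial meaning before the utterance is complete, which is what distinguishes incremental parsing from whole-sentence parsing.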

I have not studied the theory of emergent languages, as I am focused on what I 
think is their excellent production-rule engine for bidirectional grammars.  I 
would be glad to provide an introduction on LinkedIn to Pieter Wellens, a PhD 
student at the associated VUB AI-Lab.

-Steve
 
Stephen L. Reed 
Artificial Intelligence Researcher
http://texai.org/blog
http://texai.org
3008 Oak Crest Ave.
Austin, Texas, USA 78704
512.791.7860

----- Original Message ----
From: Ben Goertzel <[EMAIL PROTECTED]>
To: agi@v2.listbox.com
Sent: Sunday, February 3, 2008 8:03:13 AM
Subject: Re: [agi] Emergent languages Org

Thanks for the references...

I found this paper

Kaplan, F., Oudeyer, P-Y., Kubinyi, E. and Miklosi, A. (2002) Robotic
clicker training, Robotics and Autonomous Systems, 38(3-4), pp. 197-206.

at (near the bottom)

http://www.csl.sony.fr/~py/clickerTraining.htm

interesting in terms of highlighting the difference between virtual-world
and physical-robotics teaching of agents, as well as the basic
difference between Novamente's Virtual Animal Brain system and real
dog brains...

They point out that imitation learning is rarely used for teaching
animals, both because animals are bad at imitation and because of
differences between human and animal anatomy.

However, Novamente is good at imitation, and in a virtual-world
context the differences between human and animal anatomy can be
finessed pretty easily (by simply supplying the virtual animal with
suggestions about how to map specific human-avatar animations into
specific animal-avatar animations).
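In a virtual world that mapping can literally be a lookup table from human-avatar animation names to animal-avatar equivalents. The sketch below is hypothetical: the animation names and the fallback behavior are assumptions for illustration, not anything from Novamente's system.

```python
# Hypothetical sketch: finesse anatomy differences by mapping
# human-avatar animation names onto animal-avatar equivalents.
# All names here are invented for illustration.

ANIMATION_MAP = {
    "wave_hand": "wag_tail",  # no hands, so map to a tail gesture
    "kneel": "sit",
    "jump": "jump",           # shared where anatomy allows
}

def translate_demo(human_animations):
    """Return the animal-avatar animations the agent should imitate,
    falling back to a passive 'observe' when no mapping exists."""
    return [ANIMATION_MAP.get(a, "observe") for a in human_animations]
```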

What they advocate in the paper, for teaching robots, is "clicker
training", which is basically Skinnerian reinforcement learning with a
judicious, time-variant sequence of partial rewards.  At first you
reward the animal for doing 1/10 of the behavior right; then, after it
can do that, you reward it for doing 2/10 of the behavior right, etc.
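That ratcheting-threshold idea (behavior shaping) can be sketched in a few lines. This is a toy model under stated assumptions, not the Kaplan et al. setup: the learner is an abstract "skill" scalar, attempts are noisy, and each rewarded attempt ("click") nudges the skill upward while the reward threshold rises stage by stage.

```python
# Toy model of clicker-style shaping: reward any attempt that meets the
# current partial-behavior threshold, then raise the threshold once the
# stage is over. The skill dynamics are invented for illustration.

import random

def shape_behavior(stages=10, trials_per_stage=200, seed=0):
    """Return the skill level reached after each shaping stage.
    Stage k rewards attempts that get k/stages of the behavior right."""
    rng = random.Random(seed)
    skill = 0.0                         # fraction of behavior mastered
    history = []
    for stage in range(1, stages + 1):
        threshold = stage / stages      # 1/10 right, then 2/10, ...
        for _ in range(trials_per_stage):
            attempt = skill + rng.uniform(0.0, 0.2)   # noisy attempt
            if attempt >= threshold:    # the "click": partial reward
                skill = min(1.0, skill + 0.01)        # small improvement
        history.append(round(skill, 2))
    return history
```

Because early thresholds are low, the learner gets rewarded often enough to make progress, whereas demanding the full behavior from the start would almost never produce a reward signal at all.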

In their work on language learning

http://www.csl.sony.fr/~py/languageAcquisition.htm

I see nothing coming remotely close to a discussion of the learning of
syntax or complex semantics ... what I see is some experiments in
which robots learned, through spontaneous exploration and
reinforcement, the simple fact that vocalizing toward other agents is
a useful thing to do.  Which is certainly interesting ... but it's
really just a matter of "learning THAT vocal communication exists", in
a setting where not that many other possibilities exist...

-- Ben G


On Feb 3, 2008 7:08 AM, Mike Tintner <[EMAIL PROTECTED]> wrote:
> Jeez there's always something new. Anyone know about this (which seems at a
> glance loosely relevant to Ben's approach)?
>
> http://www.emergent-languages.org/
>
> Overview
>
> This site provides an introduction to the research on emergent and
> evolutionary languages as conducted at the Sony Computer Science Laboratory
> in Paris and the AI-Lab at the VUB in Brussels. One of the principal
> objectives of this research is to identify the cognitive capabilities that
> artificial agents must possess to enable, in a population of such agents,
> the emergence and evolution of a language that exhibits characteristic
> features identified in natural languages.
>
> Looks like Sony/Aibo-financed. Luc Steels seems to be a principal figure.
> This is quite fun:
>
> http://www.csl.sony.fr/~py/clickerTraining.htm
>
> Here he explains/justifies his approach:
>
> http://www.csl.sony.fr/downloads/papers/2006/steels-06a.pdf
>
> And how did I get to all this? From, tangentially, Construction Grammar,
> which is yet another interesting aspect of cognitive linguistics:
>
> http://en.wikipedia.org/wiki/Construction_grammar
>
>
> -----
> This list is sponsored by AGIRI: http://www.agiri.org/email
> To unsubscribe or change your options, please go to:
> http://v2.listbox.com/member/?&;
>



-- 
Ben Goertzel, PhD
CEO, Novamente LLC and Biomind LLC
Director of Research, SIAI
[EMAIL PROTECTED]

"If men cease to believe that they will one day become gods then they
will surely become worms."
-- Henry Miller
