Pei,

I have a different sort of reason for thinking embodiment is important ...
it's a deeper reason that I think underlies the "embodiment is important
because of symbol grounding" argument.

Linguistic data, mathematical data, visual data, motoric data etc. are all
just bits ... and intelligence needs to work by recognizing patterns among
these bits, especially patterns related to system goals.

What I think is that the set of patterns in perceptual and motoric data
has radically different statistical properties from the set of patterns in
linguistic and mathematical data ... and that the properties of the former
set are intrinsically better suited to the needs of a young, ignorant,
developing mind.

All these different domains of pattern display what I've called a "dual
network" structure ... a collection of hierarchies (of progressively more
complex, hierarchically nested patterns) overlaid with a heterarchy (of
overlapping, interrelated patterns).  But the statistics of the dual
networks in the different domains are different.  I haven't fully plumbed
the difference yet ... but, among the many differences, one is that in the
perceptual/motoric domains you have a very richly connected dual network at
a very low level of the overall hierarchy -- i.e., there's a richly
connected web of relatively simple stuff to understand ... and these simple
things are then related to (and hence useful for learning) the more complex
things, and so on.
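
To make that a bit more concrete, here's a toy sketch in Python -- purely
illustrative, with invented class and node names, not code from NARS,
Novamente, or anywhere else -- modeling a dual network as a graph with two
link types (hierarchical part-of links and heterarchical association links),
plus the one statistic the argument turns on: how densely interconnected the
patterns at a given level are.

from collections import defaultdict

class DualNetwork:
    """Toy model: a hierarchy (part-of links) overlaid with a heterarchy
    (symmetric association links) among the same pattern-nodes."""

    def __init__(self):
        self.level = {}                   # node -> hierarchy level (0 = simplest)
        self.parts = defaultdict(set)     # hierarchical links: whole -> its parts
        self.assoc = defaultdict(set)     # heterarchical links: node -> related nodes

    def add_node(self, node, level):
        self.level[node] = level

    def add_part(self, whole, part):
        # hierarchical nesting: 'whole' is a more complex pattern built from 'part'
        self.parts[whole].add(part)

    def relate(self, a, b):
        # heterarchical overlap: symmetric association between two patterns
        self.assoc[a].add(b)
        self.assoc[b].add(a)

    def mean_assoc_degree(self, level):
        # average heterarchical connectivity among the patterns at one level --
        # the statistic the argument above cares about
        nodes = [n for n, lv in self.level.items() if lv == level]
        if not nodes:
            return 0.0
        return sum(len(self.assoc[n]) for n in nodes) / len(nodes)

# A perceptual-flavored toy: three simple low-level patterns, densely
# interrelated, feeding one more complex pattern above them.
net = DualNetwork()
for name in ("edge", "corner", "blob"):
    net.add_node(name, level=0)
net.add_node("square", level=1)
net.add_part("square", "edge")
net.add_part("square", "corner")
net.relate("edge", "corner")
net.relate("edge", "blob")
net.relate("corner", "blob")
print(net.mean_assoc_degree(0))   # -> 2.0: a dense web at the lowest level

The point is just that in perceptual/motoric data that low-level mean
degree comes out high, whereas in linguistic/mathematical data (on my
conjecture) the rich connectivity shows up only further up the hierarchy.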

In short, Pei, I agree that the arguments typically presented in favor of
embodiment in AI suck.  However, I think there are deeper factors going on
which do imply a profound value of embodiment for AGI.  Unfortunately, we
currently lack a really appropriate scientific language for describing the
differences in statistical organization between different pattern-sets, so
it's almost as difficult to articulate these differences as it is to
understand them...

-- Ben G

On Wed, Sep 3, 2008 at 4:58 PM, Pei Wang <[EMAIL PROTECTED]> wrote:

> TITLE: Embodiment: Who does not have a body?
>
> AUTHOR: Pei Wang
>
> ABSTRACT: In the context of AI, "embodiment" should not be
> interpreted as "giving the system a body", but as "adapting to the
> system's experience". Therefore, being a robot is neither a
> sufficient condition nor a necessary condition of being embodied. What
> really matters is the assumption about the environment for which the
> system is designed.
>
> URL: http://nars.wang.googlepages.com/wang.embodiment.pdf



-- 
Ben Goertzel, PhD
CEO, Novamente LLC and Biomind LLC
Director of Research, SIAI
[EMAIL PROTECTED]

"Nothing will ever be attempted if all possible objections must be first
overcome " - Dr Samuel Johnson


