2008/12/20 Derek Zahn <derekz...@msn.com>:
>
> And yet, in your paper (which I enjoyed), you emphasize the importance of
> not providing
> a simplistic environment (with the screwdriver example).  Without facing the
> low-level
> sensory world (either through robotics or through very advanced simulations
> feeding
> senses essentially equivalent to those of humans), I wonder if a targeted
> "human-like"
> AGI will be able to acquire the necessary concepts that children absorb and
> use as much of the metaphorical basis for their thought -- slippery, soft,
> hot, hard, rough, sharp, and on and on.

Evolution has equipped humans (and other animals) with a good
intuitive understanding of many of the physical realities of our
world. The real world is not just slippery in the physical sense; it's
slippery in the non-literal sense too. For example, I can pick up an
OXO cube (a solid object), crush it so it becomes a powder, pour it
into my stew, and stir it in so it dissolves. My mind can easily and
effortlessly track that in one sense it's the same OXO cube and in
another sense it isn't.

Another example: my cat can distinguish between surfaces that are safe
to sit on, and others that are too wobbly, even if they look the same.

An animal's intuitive physics is a complex system. I expect that in
humans a lot of this machinery is re-used to create intelligence. (It
may be true, and IMO probably is true, that it's not necessary to
re-create this machinery to make an AGI.)


-- 
Philip Hunt, <cabala...@googlemail.com>
Please avoid sending me Word or PowerPoint attachments.
See http://www.gnu.org/philosophy/no-word-attachments.html


-------------------------------------------
agi
Archives: https://www.listbox.com/member/archive/303/=now