[EMAIL PROTECTED] writes:
 
> But it should be quite clear that such methods could eventually be very handy 
> for AGI.
 
I agree with your post 100%; this type of approach is the most interesting 
AGI-related work to me.
 
> An audiovisual perception layer generates semantic interpretation on the 
> (sub)symbolic level. How could a symbolic engine ever reason about the real 
> world without access to such information?
 
Even more interesting:  How could a symbolic engine ever reason about the real 
world *with* access to such information? :)

-------------------------------------------
agi
Archives: http://www.listbox.com/member/archive/303/=now
RSS Feed: http://www.listbox.com/member/archive/rss/303/