One thought from me: throw away NL as the representation inside an intelligent
system, unless there is a mind that understands it completely - i.e. one
that can unroll and map any ambiguous, vague NL item (as long as it falls
within its cognitive reach) down to sensory-data specifics and/or
theory/implementation details, and back.


>Date: Tue, 2 Apr 2013 10:07:18 -0700
>Subject: [agi] What is "understanding"?
>From: [email protected]
>To: [email protected]
>
>Aaron, et al,
>
>Recent discussions regarding representation bring up an even more
>fundamental question - what is "understanding"?
>(...)
>
>I am NOT looking for vague wishy-washy words. I am looking for a solid
>definition that defines the outer boundary of "understanding", enough to
>guide efforts to define a good representation for it.
>Any thoughts?


-- 
=== Todor "Tosh" Arnaudov ===

.... Twenkid Research: http://research.twenkid.com

.... Author of the world's first university courses in AGI (2010, 2011):
http://artificial-mind.blogspot.com/2010/04/universal-artificial-intelligence.html

.... Todor Arnaudov's Research Blog:
http://artificial-mind.blogspot.com



-------------------------------------------
AGI
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/21088071-f452e424
Powered by Listbox: http://www.listbox.com
