1)
While in my own AI projects I am currently gravitating toward an approach
involving virtual-worlds grounding, as a general rule I don't think it's obvious
that sensorimotor grounding is needed for AGI.  Certainly it's very useful, but
there is no strong argument that it's required.  The human path to AGI is not
the only one.

2)
I think that, potentially, building a KB could be part of an approach to
"solving the grounding problem."  Encode some simple knowledge, instruct
the system in how to ground it in its sensorimotor experience ... then encode
some more (slightly more complex) knowledge ... etc.  I'm not saying this is
the best way, but it seems a viable approach.  Thus, even if you want to take
a grounding-focused approach, it doesn't follow that fully solving the grounding
problem must precede the creation and utilization of a KB.  Rather, there could
be a solution to the grounding problem that couples a KB with other aspects.


In the NM approach, we could proceed with or without a KB, and with or
without sensorimotor grounding; and I believe NARS has that same property...

My feeling is that sensorimotor grounding is an "Extremely Nice to Have"
whereas a KB is just a "Sort of Nice to Have", but I don't have a rigorous
demonstration of that....

-- Ben G


On Feb 17, 2008 11:30 AM, Russell Wallace <[EMAIL PROTECTED]> wrote:
> On Feb 17, 2008 3:34 PM, Pei Wang <[EMAIL PROTECTED]> wrote:
> > As Lukasz just pointed out, there are two topics:
> > 1. Cyc as an AGI project
> > 2. Cyc as a knowledge base useful for AGI systems.
>
> Well, I'm talking about Cyc (and similar systems) as useful for
> anything at all (other than experience to tell us what doesn't work
> and why not). But if it's proposed that such a system might be a
> useful knowledge base for something, then the something will have to
> have solved the grounding problem, right? And what I'm saying is, I
> wouldn't start off building a Cyc-like knowledge base and assume the
> grounding problem will be solved later. I'd start off with the
> grounding problem.
>
>
> -------------------------------------------
> agi
> Archives: http://www.listbox.com/member/archive/303/=now
> RSS Feed: http://www.listbox.com/member/archive/rss/303/
> Modify Your Subscription: http://www.listbox.com/member/?&;
> Powered by Listbox: http://www.listbox.com
>



-- 
Ben Goertzel, PhD
CEO, Novamente LLC and Biomind LLC
Director of Research, SIAI
[EMAIL PROTECTED]

"If men cease to believe that they will one day become gods then they
will surely become worms."
-- Henry Miller
