On Monday 25 September 2006 16:48, Ben Goertzel wrote:
> My own view is that symbol grounding is not a waste of time ... but,
> **exclusive reliance** on symbol grounding is a waste of time.

It's certainly not a waste of time in the general sense, especially if you're 
going to be building a robot! But I just don't think it's on the critical 
path.

> Novamente utilizes a combination of grounding of symbols in
> simulated-embodied experience with ingestion of information from
> existing databases.  I believe this sort of combination is optimal,
> rather than purely relying on data sources with no attention to
> embodied experience....

You're definitely right if the databases you're thinking of are things like 
CYC or Wikipedia. General English text is shot through with dependencies on 
physical/body understanding. But it's also shot through with social/emotional 
dependencies, which probably amount to 10 times as much primitive semantics.

> My own view is that all serious learning algorithms are inevitably
> going to scale exponentially -- so the whole art of AGI design is in
> figuring out appropriate tricks for making the exponent and the
> constant outside the exponential function "not too large" for problem
> classes of practical import...
>
Well, in some sense, Solomonoff/Levin search is general, but of course it 
blows up like a balloon. 
People like John McCarthy (and of course Solomonoff) talked about this kind of 
thing as far back as the '50s. EVERY attempt at a general learning algorithm 
has the same problem: it's exponential in the complexity of the structure 
created.
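For concreteness, here's a toy sketch of what that blow-up looks like (mine, not anything from Novamente or any real system): a Levin-style enumeration of candidate "programs" over a k-symbol alphabet. The candidate space grows as k**n in the program length n.

```python
# Toy illustration of the exponential blow-up in brute-force program
# search: the number of candidates grows as alphabet_size ** length.
from itertools import product

def count_candidates(alphabet_size: int, max_length: int) -> int:
    """Total number of programs of length 1..max_length."""
    return sum(alphabet_size ** n for n in range(1, max_length + 1))

def enumerate_programs(alphabet, max_length):
    """Yield every program (as a symbol tuple) up to max_length, shortest first."""
    for n in range(1, max_length + 1):
        yield from product(alphabet, repeat=n)

# With a 2-symbol alphabet and max length 3: 2 + 4 + 8 = 14 candidates.
programs = list(enumerate_programs("ab", 3))
assert len(programs) == count_candidates(2, 3)
```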
I don't think that just reducing the exponent makes a general learner. I think 
that what has to happen is that the system itself has to be able to learn the 
kind of things that can reduce the exponent -- so it can get more efficient 
as it goes. At any given point it looks exponential, but it somehow keeps 
finding "just one more trick" that lets it build ever-more sophisticated 
structures.
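One way to picture "finding just one more trick": once the searcher discovers a useful composite of primitives, it promotes that composite to a new primitive (a macro), so later targets become reachable at a smaller search depth. This is only an illustrative sketch of the idea; all the names and the search procedure are mine, not any particular system's.

```python
# Hedged sketch: macro learning shrinks the effective search depth.
from itertools import product

def find_program(primitives, target, start, max_depth):
    """Brute-force search for a shortest primitive sequence mapping start to target."""
    for depth in range(1, max_depth + 1):
        for seq in product(primitives, repeat=depth):
            x = start
            for f in seq:
                x = f(x)
            if x == target:
                return seq, depth
    return None

inc = lambda x: x + 1          # the only base primitive in this toy world

# Without macros, reaching 8 from 0 takes a depth-8 sequence of inc's.
_, depth_plain = find_program([inc], 8, 0, 8)

# Suppose the learner noticed "inc four times" is useful and promoted it
# to a macro; the same target is now found at depth 2.
add4 = lambda x: x + 4         # the learned macro
_, depth_macro = find_program([inc, add4], 8, 0, 8)
assert depth_macro < depth_plain
```

At any given depth the search still looks exponential in the number of primitives, but each promoted macro lets the system reach more sophisticated structures at a shallower depth, which is the "gets more efficient as it goes" effect described above.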

Josh
