Linas Vepstas wrote:
On Tue, Nov 13, 2007 at 12:34:51PM -0500, Richard Loosemore wrote:
Suppose that in some significant part of Novamente there is a representation system that uses "probability" or "likelihood" numbers to encode the strength of facts, as in [I like cats](p=0.75). The (p=0.75) is supposed to express the idea that the statement [I like cats] is in some sense "75% true".

Either way, we have a problem: a fact like [I like cats](p=0.75) is ungrounded because we have to interpret it. Does it mean that I like cats 75% of the time? That I like 75% of all cats? 75% of each cat? Are the cats that I like always the same ones, or is the chance of an individual cat being liked by me something that changes? Does it mean that I like all cats, but only 75% as much as I like my human family, which I like(p=1.0)? And so on and so on.
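To make the ambiguity concrete: here is a minimal sketch (plain Python, with an invented toy history of cat encounters) of a few of the readings above. The bare number tells you nothing about which of these quantities is actually being reported:

    from dataclasses import dataclass

    @dataclass
    class Encounter:
        cat_id: str
        liked: bool      # did I enjoy this particular encounter?

    # Hypothetical record of my encounters with cats (invented data).
    history = [
        Encounter("tabby",   True),
        Encounter("tabby",   True),
        Encounter("siamese", True),
        Encounter("siamese", False),
    ]

    # Reading 1: "I like cats 75% of the time" --
    # the fraction of encounters that I enjoyed.
    p_time = sum(e.liked for e in history) / len(history)

    # Reading 2: "I like 75% of all cats" --
    # the fraction of distinct cats that I liked at least once
    # (and even this reading splits into further sub-readings).
    cats = {e.cat_id for e in history}
    liked_cats = {e.cat_id for e in history if e.liked}
    p_cats = len(liked_cats) / len(cats)

    # Reading 3: "I like cats, but only 75% as much as my family" --
    # a degree of preference, not a frequency at all.
    liking_family = 1.0
    liking_cats = 0.75 * liking_family

    # 0.75, 1.0, 0.75 -- same-looking numbers, computed from the same
    # experience, meaning entirely different things.
    print(p_time, p_cats, liking_cats)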

Eh?

You are standing at the proverbial office water cooler, and Aneesh says "Wen likes cats". On your drive home, your mind races ... does this mean that Wen is a cat fancier? You were planning on taking Wen out on a date, and this tidbit of information could be useful ...

When you try to build the entire grounding mechanism(s), you are forced to become explicit about what these numbers mean, in the course of building a grounding system that you can trust to be doing its job: you cannot create a mechanism that you *know* is constructing sensible p numbers and facts throughout its development *unless* you finally bite the bullet and say what the p numbers really mean, in fully cashed out terms.
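A minimal, purely illustrative sketch (invented names, not anyone's actual design) of what "fully cashed out" might look like: the truth value carries an explicit declaration of what is being counted, so a grounding mechanism can be checked by recomputing p under that declared semantics rather than by staring at a bare number:

    from dataclasses import dataclass
    from enum import Enum, auto

    class Semantics(Enum):
        FRACTION_OF_EPISODES = auto()   # share of encounters that were positive
        FRACTION_OF_INSTANCES = auto()  # share of individual cats that are liked
        DEGREE_OF_PREFERENCE = auto()   # strength relative to a reference liking

    @dataclass
    class TruthValue:
        p: float
        semantics: Semantics   # the fact is uninterpretable without this tag

    fact = ("I like cats", TruthValue(0.75, Semantics.FRACTION_OF_EPISODES))

    # A grounding check is now possible in principle: recompute p from raw
    # experience using the *declared* semantics and compare it with the
    # stored value.
    def check(stored: TruthValue, recomputed_p: float, tol: float = 0.05) -> bool:
        return abs(stored.p - recomputed_p) <= tol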

But as a human, asking Wen out on a date, I don't really know what "Wen likes cats" ever really meant. It neither prevents me from talking to Wen, nor from telling my best buddy that "...well, I know, for instance, that she likes cats..." Lack of grounding is what makes humour funny; you can do a whole Pygmalion / Seinfeld episode on "she likes cats".

No: the real concept of "lack of grounding" is nothing so simple as the way you are using the word "grounding".

Lack of grounding makes an AGI fall flat on its face and not work.

I can't summarize the grounding literature in one post. (Though, heck, I have actually tried to do that in the past: didn't do any good).



Richard Loosemore
