Though there is a "loop", YKY's problem is not caused by circular
inference, but by "multiple inheritance", that is, by different
inference paths giving different conclusions. This is indeed a problem
in Bayes nets, and there is no general solution in that theory, except
in special cases.

This problem is solved in NARS mainly by the confidence measure,
though inference trails are also relevant.
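The idea can be sketched with a toy choice rule. To be clear: the
numbers and the discount function below are illustrative assumptions,
not the actual NARS truth-value functions; they only show how a
confidence measure can resolve the conflict between two paths.

```python
# Toy illustration of confidence-based conflict resolution.
# NOTE: the discount rule and values are illustrative assumptions,
# not the actual NARS truth-value functions.

def deduce(conf1, conf2, discount=0.9):
    """A derived judgment is less confident than either of its premises."""
    return conf1 * conf2 * discount

# Direct evidence: "penguins cannot fly" (frequency near 0, high confidence).
direct = {"freq": 0.0, "conf": 0.9}

# Conclusion derived through "penguins are birds" + "birds can fly":
derived = {"freq": 1.0, "conf": deduce(0.9, 0.9)}  # 0.729 < 0.9

# Choice rule: when two judgments conflict, prefer the higher confidence.
winner = direct if direct["conf"] >= derived["conf"] else derived
print(winner["freq"])  # 0.0 -- the direct "penguins cannot fly" wins
```

Since every derivation step discounts confidence, the directly given
statement beats the conclusion reached through the longer path.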

See my "Reference Classes and Multiple Inheritances" at
http://www.cogsci.indiana.edu/farg/peiwang/papers.html#reference_classes

Pei

On Fri, Jul 4, 2008 at 11:00 PM, Ben Goertzel <[EMAIL PROTECTED]> wrote:
> YKY,
>
> PLN, like NARS, uses inference trails
>
> We have tried omitting them, though, and found interesting results:
> errors do propagate, but not boundlessly, and network truth values
> remain meaningful.
>
> Loopy Bayes nets basically "just live with the circularity" and rely
> on mathematical properties of the propagation rules to keep the
> resulting error under control.  Nice stuff, but it only works under
> fairly special assumptions.
>
> Traditional Bayes nets just assume a hierarchical structure and ignore
> the conditional probs not in accordance w/ the hierarchy, getting at
> them only indirectly via the ones in the hierarchy.  This is why
> structure learning is so important in Bayes nets.
>
> -- Ben
>
>
> On Fri, Jul 4, 2008 at 4:10 AM, YKY (Yan King Yin)
> <[EMAIL PROTECTED]> wrote:
>> I'm considering nonmonotonic reasoning using Bayes nets, and got stuck.
>>
>> There is an example on p. 483 of Pearl's 1988 book, Probabilistic
>> Reasoning in Intelligent Systems (PRIIS):
>>
>> Given:
>> "birds can fly"
>> "penguins are birds"
>> "penguins cannot fly"
>>
>> The desideratum is to conclude that "penguins are birds, but penguins
>> cannot fly".
>>
>> Pearl translates the KB to:
>>   P(f | b) = high
>>   P(f | p) = low
>>   P(b | p) = high
>> where high and low mean arbitrarily close to 1 and 0, respectively.
>>
>> If you draw this on paper you'll see a triangular loop.
>>
>> Then Pearl continues to deduce:
>>
>> Conditioning P(f | p) on both b and ~b:
>>    P(f | p) = P(f | p,b) P(b | p) + P(f | p,~b) [1 - P(b | p)]
>>              >= P(f | p,b) P(b | p)
>>
>> Thus
>>    P(f | p,b) <= P(f | p) / P(b | p), which is close to 0.
>>
>> Thus Pearl concludes that "given penguin and bird, fly is not true".
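[A quick numeric sanity check of Pearl's bound; the concrete
probabilities below are arbitrary stand-ins for "high" and "low".]

```python
# Check Pearl's bound P(f | p,b) <= P(f | p) / P(b | p) numerically.
# The concrete values are arbitrary choices for "high" and "low".
p_f_given_p = 0.01   # "penguins cannot fly"
p_b_given_p = 0.99   # "penguins are birds"

# By total probability,
#   P(f|p) = P(f|p,b) P(b|p) + P(f|p,~b) [1 - P(b|p)],
# and dropping the nonnegative second term gives
#   P(f|p) >= P(f|p,b) P(b|p),
# hence the bound:
bound = p_f_given_p / p_b_given_p
print(f"P(f | p,b) <= {bound:.4f}")  # ~0.0101: still close to 0
```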
>>
>> But something seems wrong here.  The Bayes net is loopy, and it
>> seems we could equally conclude that "fly" given "penguin" and
>> "bird" is either close to 0 or close to 1.  (The loop is somewhat
>> symmetric.)
>>
>> Ben, do you have a similar problem dealing with nonmonotonicity using
>> probabilistic networks?
>>
>> YKY
>>
>>
>> -------------------------------------------
>> agi
>> Archives: http://www.listbox.com/member/archive/303/=now
>> RSS Feed: http://www.listbox.com/member/archive/rss/303/
>> Modify Your Subscription: http://www.listbox.com/member/?&;
>> Powered by Listbox: http://www.listbox.com
>>
>
>
>
> --
> Ben Goertzel, PhD
> CEO, Novamente LLC and Biomind LLC
> Director of Research, SIAI
> [EMAIL PROTECTED]
>
> "Nothing will ever be attempted if all possible objections must be
> first overcome " - Dr Samuel Johnson
>
>

