I've been considering nonmonotonic reasoning using Bayes nets, and I've gotten stuck.

There is an example on p. 483 of J. Pearl's 1988 book, Probabilistic Reasoning in Intelligent Systems (PRIIS):

Given:
"birds can fly"
"penguins are birds"
"penguins cannot fly"

The desideratum is to conclude that "penguins are birds, but penguins
cannot fly".

Pearl translates the KB to:
   P(f | b) = high
   P(f | p) = low
   P(b | p) = high
where "high" and "low" mean arbitrarily close to 1 and 0, respectively.

If you draw this on paper, you'll see a triangular loop.
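To make the setup concrete, here is a tiny Python sketch of the three statements; the 0.999/0.001 numbers are just my own stand-ins for "high" and "low", not Pearl's:

    # Stand-in numbers for "high" and "low" -- my choice, not from the book.
    HIGH, LOW = 0.999, 0.001

    P_f_given_b = HIGH  # P(f | b): birds can fly
    P_f_given_p = LOW   # P(f | p): penguins cannot fly
    P_b_given_p = HIGH  # P(b | p): penguins are birds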

Pearl then deduces:

Conditioning P(f | p) on both b and ~b,
    P(f | p) = P(f | p,b) P(b | p) + P(f | p,~b) [1 - P(b | p)]
            >= P(f | p,b) P(b | p)

Thus
    P(f | p,b) <= P(f | p) / P(b | p),
which is close to 0, since P(f | p) is close to 0 and P(b | p) is close to 1.
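As a quick sanity check on that bound, plugging in the stand-in numbers from above (again, 0.999/0.001 are my own placeholders for "high" and "low"):

    # Upper bound on P(f | p,b) implied by the identity above.
    P_f_given_p = 0.001   # "low"
    P_b_given_p = 0.999   # "high"
    bound = P_f_given_p / P_b_given_p
    print(bound)          # ~0.001, so P(f | p,b) is forced to be near 0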

Pearl thus concludes that, given penguin and bird, "fly" is not true.

But something seems wrong here.  The Bayes net is loopy, and it seems we
could equally well conclude that "fly", given "penguin" and "bird", can be
close to either 0 or 1, since the loop is somewhat symmetric.

Ben, do you have a similar problem dealing with nonmonotonicity using
probabilistic networks?

YKY

