I tend to agree with Richard's view and I may build an AGI with symbolic, non-numerical inference.
 
1.  As Russell pointed out, if the priors are unknown or known only to extremely low precision, Bayes' rule is not very applicable.  Number-crunching with priors of 1-2 bits of precision is "garbage in, garbage out".
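To make the point concrete, here is a small sketch (my own illustration, not from the original post): push an imprecise prior through Bayes' rule and watch how wide the posterior stays. With a prior known only to about 1 bit, say somewhere in [0.25, 0.75], even a fairly informative 4:1 likelihood ratio leaves a very wide posterior interval.

```python
def posterior(prior, likelihood_ratio):
    """Posterior P(H|E) from prior P(H) and LR = P(E|H)/P(E|~H), via odds form."""
    odds = prior / (1.0 - prior) * likelihood_ratio
    return odds / (1.0 + odds)

lr = 4.0  # evidence favouring H by 4:1 (an assumed, illustrative value)
lo = posterior(0.25, lr)
hi = posterior(0.75, lr)
print(f"posterior ranges from {lo:.2f} to {hi:.2f}")  # roughly 0.57 to 0.92
```

The update itself is exact; the imprecision of the input simply survives it, which is what "garbage in, garbage out" means here.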
 
2.  It seems that in the majority of situations, priors are known to only 1-2 bits of precision.
 
3.  To put it another way, it seems that in most real situations the quality of an inference is improved far more by taking into account additional facts / contexts than by increasing the precision of the probabilities attached to a smaller set of facts.
 
4.  Even worse, this seems to be a fundamental feature of reality.  Can an AGI increase the precision of its internal probabilities by continually updating them?
 
5.  The answer seems to be negative.  Real events are dependent on a lot of other events in a complex way.  They usually do not repeat again and again under the same conditions.
 
6.  Priors can be known with great precision in 2 cases: 1) the outcomes are enumerable and equi-probable, as in rolling a die; 2) the event has repeated many times under identical conditions.  It seems that the majority of events are not like these.
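A quick back-of-the-envelope sketch of case 2 (my own numbers, for illustration): the standard error of a frequency estimate shrinks only as 1/sqrt(n), so a precise prior demands a large number of repetitions under identical conditions, which real-world events rarely provide.

```python
import math

def stderr(p, n):
    """Standard error of a frequency estimate of a probability p after n i.i.d. trials."""
    return math.sqrt(p * (1.0 - p) / n)

# Estimating the probability of one face of a fair die (p = 1/6):
for n in (10, 100, 10000):
    print(f"n = {n:5d}: stderr = {stderr(1/6, n):.4f}")
```

Going from ~0.12 at 10 trials to ~0.004 takes 10,000 identical repetitions; for one-off events no such sequence exists, which is the negative answer of point 5.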
 
YKY
