Ben, Jim, I am new to the list, and I am very happy to be learning from you every day!
Could you point me to references on why Solomonoff induction is not important for AGI? And, vice versa, on why Bayesian approaches are important? What other approaches are there (maybe AIXI)? I know about the limitations of these approaches at a very basic level, but I would like to learn more. Thanks.

On Dec. 2, 2015, 16:02, "Jim Bromer" <[email protected]> wrote:

> Ben Goertzel <[email protected]> wrote:
> Regarding Solomonoff induction, it's the conceptual basis underlying any
> work done in machine learning that uses a simplicity bias (e.g. MOSES with
> an Occam bias, any Minimum Description Length work, etc.). I don't
> think it's the golden path to AGI, but it's certainly relevant.
>
> The combination of a compression method with a learning and recognition
> method is interesting. I think that is something that is needed. But I do
> not think that Solomonoff induction or neural networks are strong enough.
> Solomonoff induction is a general method to compress narrow reference
> objects. OK, if it were a feasible AGI method, that might mean that other
> applications of the reference could be used to refer to greater collections
> of relevant material, but there are obvious contradictions to that supposition.
>
> It is as if you can understand what I am saying but just refuse to
> spend the time to think about it.
> Jim Bromer
>
> *AGI* | Archives <https://www.listbox.com/member/archive/303/=now>
> <https://www.listbox.com/member/archive/rss/303/27612553-932623ba> |
> Modify <https://www.listbox.com/member/?&>
> Your Subscription <http://www.listbox.com>
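To make the "simplicity bias" point concrete, here is a minimal sketch of a two-part Minimum Description Length score, in the spirit of the Occam bias Ben mentions. It is only an illustration under my own assumptions: the names (`description_length`, `best_degree`), the 16-bits-per-parameter cost, and the crude residual-coding term are all invented for this toy, and it is not MOSES or anyone's actual system.

```python
# Toy illustration of a simplicity (Occam) bias via two-part MDL:
# total description length = bits to encode the model's parameters
# plus bits to (approximately) encode the data's residuals under it.
# The exact coding costs below are illustrative assumptions only.
import math
import numpy as np

def description_length(x, y, degree, bits_per_param=16):
    """Crude two-part MDL score for a polynomial fit of a given degree."""
    coeffs = np.polyfit(x, y, degree)
    residuals = y - np.polyval(coeffs, x)
    n = len(x)
    rss = float(np.sum(residuals ** 2))
    model_bits = (degree + 1) * bits_per_param       # cost of the parameters
    data_bits = 0.5 * n * math.log2(rss / n + 1.0)   # cost of the residuals
    return model_bits + data_bits

def best_degree(x, y, max_degree=6):
    """Pick the polynomial degree with the shortest total description."""
    return min(range(max_degree + 1),
               key=lambda d: description_length(x, y, d))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = np.linspace(-3, 3, 50)
    y = 2.0 * x + 1.0 + rng.normal(0, 0.1, size=x.shape)  # truly linear data
    # The Occam bias should favor a low-degree model: higher degrees fit
    # the noise slightly better but pay more bits for extra parameters.
    print(best_degree(x, y))
```

The point of the toy: past degree 1 the residual term barely shrinks, while every extra coefficient costs a fixed number of bits, so the minimum-description model stays simple even though higher-degree fits have lower raw error.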
