Ben Goertzel <[email protected]> wrote:
> Regarding Solomonoff induction, it's the conceptual basis underlying any
> work done in machine learning that uses a simplicity bias (e.g. MOSES with
> an Occam bias ... any Minimum Description Length work, etc.). I don't
> think it's the golden path to AGI but it's certainly relevant.
I get a little carried away at times. I don't mean to sound like I
understand everything and know better, but my guess is that if you were
able to show that machine learning using a simplicity bias achieved
something in AI, it would owe its efficacy to the network learning method
it was implemented with (perhaps a Bayesian-like network) and to that
hybridization, not to the simplicity rule itself. As I said before, Bayes'
rule might not be considered fundamental to AGI either; in an
implementation like a Bayesian network, the action of the system is not
going to be purely Bayesian anyway.

Jim Bromer

On Wed, Dec 2, 2015 at 11:23 AM, Ben Goertzel <[email protected]> wrote:

> On Wed, Dec 2, 2015 at 5:36 PM, Jim Bromer <[email protected]> wrote:
>
>> Ben, your reply makes sense, but you should have somehow made it clearer
>> that you were making your choices for subjective reasons. I do not think
>> that Solomonoff's methods can be used as a basis for AI, and there is no
>> way that you can demonstrate that they have been. Bayesian methods, on
>> the other hand, have been demonstrated to be a reasonable basis for AI.
>> If I were spending the time to write an article like that, I would be
>> able to provide some substantial basis defending my point of view.
>
> I could provide a "substantial basis" underlying the choices I made in
> that article, but the nature of such an article is that it has to be
> brief and can't contain the justification underlying each point
> mentioned...
>
> Regarding Solomonoff induction, it's the conceptual basis underlying any
> work done in machine learning that uses a simplicity bias (e.g. MOSES
> with an Occam bias ... any Minimum Description Length work, etc.). I
> don't think it's the golden path to AGI but it's certainly relevant.
> The relatively substantial amount of space devoted to the topic in that
> Scholarpedia article is somewhat correlated with the preferences of the
> article's referees, though, to be honest ;p ...
>
> -- Ben
>
> *AGI* | Archives <https://www.listbox.com/member/archive/303/=now>
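The point under debate ("does the simplicity rule itself do the work?") is easier to see concretely. Below is a minimal, self-contained sketch of what an Occam/MDL-style simplicity bias means in practice: candidate models are scored by a two-part description length (a BIC-style approximation, complexity term plus data-fit term), and the shortest total code wins. This is a generic illustration only; the scoring function and all names here are illustrative assumptions, not the actual scoring used by MOSES or any particular MDL system.

```python
# Toy illustration of an Occam / MDL-style simplicity bias in model
# selection. Score = L(model) + L(data | model), approximated BIC-style as
#   (k/2)*log(n)  +  (n/2)*log(RSS/n)
# where k = number of parameters, n = number of data points, RSS = residual
# sum of squares. Lower total description length wins.
import math

def fit_polynomial(xs, ys, degree):
    """Least-squares polynomial fit via the normal equations (no numpy)."""
    k = degree + 1
    # Normal equations A.c = b for the Vandermonde least-squares system.
    A = [[sum(x ** (i + j) for x in xs) for j in range(k)] for i in range(k)]
    b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(k)]
    # Gaussian elimination with partial pivoting.
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coeffs = [0.0] * k
    for r in range(k - 1, -1, -1):
        s = b[r] - sum(A[r][c] * coeffs[c] for c in range(r + 1, k))
        coeffs[r] = s / A[r][r]
    return coeffs

def description_length(xs, ys, coeffs):
    """BIC-style two-part code length: complexity term + fit term."""
    n, k = len(xs), len(coeffs)
    rss = sum((y - sum(c * x ** i for i, c in enumerate(coeffs))) ** 2
              for x, y in zip(xs, ys))
    rss = max(rss, 1e-12)  # guard against log(0) on a perfect fit
    return 0.5 * k * math.log(n) + 0.5 * n * math.log(rss / n)

# Noisy samples of an underlying quadratic: y = 1 + 2x + 0.5x^2 + "noise".
xs = [0.1 * i for i in range(30)]
noise = [0.05 * math.sin(17.0 * x) for x in xs]  # deterministic wiggle
ys = [1 + 2 * x + 0.5 * x * x + e for x, e in zip(xs, noise)]

# Score candidate degrees: higher degrees fit the wiggle slightly better,
# but the complexity term penalizes the extra coefficients.
scores = {d: description_length(xs, ys, fit_polynomial(xs, ys, d))
          for d in range(1, 8)}
best = min(scores, key=scores.get)
print("degree -> description length:",
      {d: round(s, 2) for d, s in scores.items()})
print("selected degree:", best)
```

The sketch also makes Jim's distinction concrete: the simplicity term here is a one-line penalty, while nearly all of the code is the fitting machinery it is grafted onto.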
