Ben Goertzel <[email protected]> wrote:
Regarding Solomonoff induction, it's the conceptual basis underlying any
work done in machine learning that uses a simplicity bias (e.g. MOSES with
an Occam bias ... any Minimum Description Length work, etc.).   I don't
think it's the golden path to AGI but it's certainly relevant.
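The Occam/MDL simplicity bias mentioned above can be sketched as a two-part code: the total description length of a hypothesis is the bits needed to state the model plus the bits needed to encode the data given the model, and the learner prefers whichever hypothesis minimizes the sum. This is only an illustrative toy, not anyone's actual system; the function names, the 8-bit model costs, and the crude residual coding scheme are all invented for the example.

```python
import math

# Toy two-part MDL comparison (illustrative only).
# Total code length = bits to describe the model
#                   + bits to encode the data given the model
# (here, a crude 1 + log2(1 + |error|) bits per residual).

def code_length(description_bits, predictions, data):
    residual_bits = sum(1 + math.log2(1 + abs(p - d))
                        for p, d in zip(predictions, data))
    return description_bits + residual_bits

data = list(range(1, 21))  # the sequence 1, 2, ..., 20

# Model A: the rule "count up from 1" -- short description, perfect fit.
rule_bits = 8.0
rule_preds = list(range(1, 21))

# Model B: memorize the sequence literally -- description grows with the data.
table_bits = 8.0 * len(data)
table_preds = list(data)

# The simplicity bias picks the shorter total code: the rule wins.
assert code_length(rule_bits, rule_preds, data) < \
       code_length(table_bits, table_preds, data)
```

On this toy data the rule model costs 28 bits total while the literal table costs 180, so an Occam-biased learner selects the rule; the same trade-off is what an MDL criterion formalizes.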



The combination of a compression method with a learning and recognition
method is interesting. I think that is something that is needed. But I do
not think that Solomonoff induction or neural networks are strong enough.
Solomonoff induction is a general method for compressing narrow reference
objects. Granted, if it were a feasible AGI method, that might mean that other
applications of the reference could be used to refer to greater collections
of relevant material, but there are obvious contradictions in that supposition.

It is as if you can understand what I am saying but just refuse to
spend the time to think about it.
Jim Bromer



-------------------------------------------
AGI
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/21088071-f452e424
