> -----Original Message-----
> From: Matt Mahoney [mailto:[email protected]]
> 
> Here is a draft of a paper I am working on. I would appreciate any
> comments. You might find the content somewhat controversial.
> 

This is a big statement:

"AI requires both a brain and a body. Therefore, we should expect its
algorithmic (Kolmogorov) complexity to be similar to that of a human."

Though the word "similar" leaves much open for interpretation.

I don't know if anyone knows anywhere near what the minimal K-complexity is
for running general intelligence. We know the approximate K-complexity of the
smallest known implementation of human intelligence, and that is us.

Some people think the minimal K-complexity for GI is quite a bit smaller
than that of a human. I would think it would have to be... But then there
is the starting K-complexity of the tabula rasa at human birth, before it
gets filled with information... similar to AGI, I suppose. Though some AGI
designs imply general intelligence being achieved only after the system runs
for a while, not at the get-go.

John


-------------------------------------------
AGI
Archives: https://www.listbox.com/member/archive/303/=now
RSS Feed: https://www.listbox.com/member/archive/rss/303/21088071-f452e424
Modify Your Subscription: 
https://www.listbox.com/member/?member_id=21088071&id_secret=21088071-58d57657
Powered by Listbox: http://www.listbox.com
