On Sun, Dec 9, 2012 at 9:32 AM, John G. Rose <[email protected]> wrote:
>> -----Original Message-----
>> From: Matt Mahoney [mailto:[email protected]]
>>
>> Here is a draft of a paper I am working on. I would appreciate any
>> comments. You might find the content somewhat controversial.
>>
>
> This is a big statement:
>
> "AI requires both a brain and a body. Therefore, we should expect its
> algorithmic (Kolmogorov) complexity to be similar to that of a human."
>
> Though the word "similar" leaves much open for interpretation.

I mean close numerically.

> I don't know if anyone knows anywhere near what the minimal K-complexity is
> for running general intelligence. We know the approximate minimal known
> K-complexity for human intelligence, and that is from ourselves.

We didn't know it, which is why we had monstrous failures like Cyc.

> Some people think the minimal k-complexity for GI is quite a bit smaller
> than that of a human's.

Yes, some people still think that.

> I would think it would have to be... But then there is the starting
> K-complexity of the tabula rasa at human birth, before it gets filled
> with information... similar to AGI, I suppose. Though some AGI designs imply
> general intelligence being achieved after the system runs for a while, not
> at the get-go.

There is the complexity of a single baby brain, of a single adult
brain, and of the collective brains of humanity. Some people call the
first or second case AGI and declare the problem solved. But what
needs to be done is immensely harder.
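As an aside, Kolmogorov complexity itself is uncomputable, but any lossless
compressor gives a concrete upper bound: K(x) is at most the compressed size of
x plus the (constant) size of the decompressor. A minimal sketch of that idea,
using Python's standard zlib (the function name and data here are just
illustrative):

```python
import os
import zlib

def k_upper_bound(data: bytes) -> int:
    """Compressed size in bytes: a crude upper bound on K(data),
    up to the constant cost of the decompressor itself."""
    return len(zlib.compress(data, 9))

# A highly regular string compresses far below its raw length...
regular = b"ab" * 5000          # 10,000 bytes of pure repetition
# ...while random bytes barely compress at all (most strings are
# incompressible, which is why these bounds are usually loose).
noisy = os.urandom(10000)

print(k_upper_bound(regular))   # small: the pattern is cheap to describe
print(k_upper_bound(noisy))     # near 10000: no structure to exploit
```

The gap between the two outputs is the point: a brain, or an AGI, with high
algorithmic complexity cannot be specified by a short program plus a little
data, no matter how cleverly it is encoded.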


-- Matt Mahoney, [email protected]

