On Wed, Nov 26, 2014 at 8:50 PM, Tim Tyler via AGI <[email protected]> wrote:
> On 26/11/2014 19:03, Ben Goertzel via AGI wrote:
>> What's less clear is whether it's productive to view prediction as
>> **the core** cognitive functionality of human-level intelligence, as
>> Jeff Hawkins and others have suggested

You have 10^9 bits of knowledge programmed by evolution and another
10^9 bits learned from your environment after birth. The evolved
knowledge took 10^8 times longer to learn (3 billion years vs. 30
years).
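A back-of-the-envelope sketch of those ratios (the bit counts and time
spans are the rough estimates above, not measured quantities):

```python
# Rough, illustrative estimates from the argument above.
evolved_bits = 1e9         # knowledge encoded in the genome (assumed)
learned_bits = 1e9         # knowledge acquired after birth (assumed)

evolved_time_years = 3e9   # ~3 billion years of evolution
learned_time_years = 30    # ~30 years of individual learning

# Same amount of knowledge, vastly different learning times.
ratio = evolved_time_years / learned_time_years
print(f"evolution took {ratio:.0e} times longer")  # ~1e8
```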

Contrast this with text compression at the same scale, for the
top-ranked programs from http://mattmahoney.net/dc/text.html

About 99.9% of the knowledge is learned after birth (the size of the
compressed file) and 0.1% by the evolution of the software (the size
of the decompression program). The evolved knowledge took 10^4 to 10^5
times longer to learn than the statistical knowledge (years vs.
hours).
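The same split, sketched with assumed sizes at roughly the scale of top
entries on the large text benchmark (not exact figures for any
particular program):

```python
# "Learned" knowledge = the compressed file; "evolved" knowledge = the
# decompression program. Sizes below are assumed for illustration.
compressed_output = 1.5e8   # bytes: the compressed file
decompressor_size = 1.5e5   # bytes: the decompressor itself

total = compressed_output + decompressor_size
print(f"learned after birth: {compressed_output / total:.1%}")  # 99.9%
print(f"evolved software:    {decompressor_size / total:.1%}")  # 0.1%

# Time ratio: years of development vs. hours of compression (assumed).
dev_time_hours = 5 * 365 * 24   # ~5 years developing the program
run_time_hours = 5              # ~5 hours to compress the input
print(dev_time_hours / run_time_hours)  # on the order of 10^4
```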

I am intimately familiar with the process of developing data
compression software. It is an iterative process: you think you have a
good idea of what changes ought to improve compression, but when you
run the experiment you turn out to be right maybe less than half of
the time. Even if you are a fast coder, development time is limited by
the CPU power available to run the tests, each of which gains you a
couple of bits of knowledge.
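That "couple of bits per test" can be made precise: treating each
experiment as a binary outcome (the change helps or it doesn't), the
information gained is the Shannon entropy of the outcome. Being right
about half the time means each test is close to maximally informative:

```python
import math

def bits_per_experiment(p_success: float) -> float:
    """Shannon entropy of a binary outcome with probability p_success."""
    q = 1.0 - p_success
    return -(p_success * math.log2(p_success) + q * math.log2(q))

print(bits_per_experiment(0.5))  # 1.0 bit: maximally informative test
print(bits_per_experiment(0.9))  # ~0.47 bits: predictable tests teach less
```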

Optimal text compression implies passing the Turing test. But this
experiment (now 8.5 years old) suggests we are still a very, very long
way from optimal code.

-- 
-- Matt Mahoney, [email protected]

