On Mon, Jan 27, 2020, 12:04 PM <[email protected]> wrote:

> I see the Hutter Prize is a separate contest from Matt's contest/rules:
> http://mattmahoney.net/dc/textrules.html
>

Marcus Hutter and I couldn't agree on the details of the contest, which is
why there are two almost identical contests.

He is offering prize money, so I understand the need for strict hardware
restrictions (1 MB RAM and 8 hours x 2.2 GHz to extract 100 MB of text) to
keep the contest fair and accessible. But I think those limits are
unrealistic for AGI. The human brain takes about 20 years to process 1 GB
of language, which works out to roughly 10^25 operations on 6 x 10^14
synapses.
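For what it's worth, the arithmetic behind that estimate checks out in a few lines of Python. The average rate of about 10 synaptic operations per second is an assumption on my part (it's the rate that makes the other figures consistent); the 20 years, 6 x 10^14 synapses, and 10^25 total come from the estimate above.

```python
# Sanity check of the brain-scale estimate.
# Assumed: ~10 synaptic operations per synapse per second (illustrative).
# From the text: 20 years of language exposure, 6 x 10^14 synapses,
# and a total of order 10^25 operations.

SECONDS_PER_YEAR = 365.25 * 24 * 3600
years = 20
synapses = 6e14
ops_per_synapse_per_sec = 10  # assumed average rate

total_ops = synapses * ops_per_synapse_per_sec * years * SECONDS_PER_YEAR
print(f"total synaptic operations: {total_ops:.1e}")
# roughly 4 x 10^24, i.e. order 10^25
```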

The first main result of my 12 years of testing 1000+ versions of 200
compressors is that compression (as a measure of prediction accuracy or
intelligence) increases with the log of computing time and the log of
memory (and probably the log of code complexity, which I didn't measure).
The best way to establish this relationship is to test over as wide a range
as possible by removing time and hardware restrictions. The top-ranked
program (cmix) requires 32 GB of RAM and takes a week, which is about a
million times more time and memory than the fastest programs. But it is
still a billion times faster, and uses 100,000 times less memory, than a
human-brain-sized neural network.
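Those ratios are easy to sanity-check. In the sketch below, the sustained CPU rate of ~10^10 operations/second and the cost of ~5 bytes per synapse are my assumptions for illustration; the 32 GB, one week, 10^25 operations, and 6 x 10^14 synapses are from the estimates above.

```python
# Rough comparison of cmix's resources to a brain-sized neural network.
# Assumed for illustration: the CPU sustains ~10^10 operations/second,
# and representing a synapse costs ~5 bytes.
# From the text: 32 GB RAM, one week of runtime, ~10^25 operations to
# process ~1 GB of language, and 6 x 10^14 synapses.

cpu_ops_per_sec = 1e10                        # assumed sustained rate
cmix_ops = cpu_ops_per_sec * 7 * 24 * 3600    # one week of compute
brain_ops = 1e25                              # ops for ~1 GB of language

cmix_memory = 32e9                            # 32 GB in bytes
brain_memory = 6e14 * 5                       # ~5 bytes/synapse (assumed)

print(f"speed ratio:  {brain_ops / cmix_ops:.2e}")      # about a billion
print(f"memory ratio: {brain_memory / cmix_memory:.2e}")  # about 100,000
```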

The other main result is that the most effective text compression
algorithms are based on neural networks that model human language learning
(lexicon, then semantics, then grammar, in that order). But the grammatical
modeling is still rudimentary and probably requires a lot more hardware to
do properly.


------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/T65747f0622d5047f-M5e6922e62911859156b660fd