On Mon, Feb 3, 2020, 8:34 AM stefan.reich.maker.of.eye via AGI <
[email protected]> wrote:

> > That is why text compression is a test for AI and solving it solves AI.
>
> I challenge that second assumption. What does "solving text compression"
> mean? I don't understand what that is supposed to be.
>

It means predicting text as well as a human. Shannon estimated in 1950 that
English text has an entropy of 0.6 to 1.3 bits per character. Common
compressors like zip, rar, and 7zip compress English text to 2-3 bpc; the
best compressors reach about 1.0 bpc. I can't say whether it's solved,
because Shannon's estimate, despite its age, is hard to improve on. He used
character-guessing tests, but those only give a range, because different
probability distributions can produce the same guessing results.
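To make the bpc numbers concrete, here is a minimal sketch in Python using
the standard zlib module (a dictionary coder, like zip). The sample text is
tiny, so header overhead inflates the result; real estimates are done over
megabyte-scale corpora, where zip-class compressors land in the 2-3 bpc
range.

```python
import zlib

# A short non-repetitive English sample. Serious measurements use large
# corpora (e.g. megabytes of text); this just shows the arithmetic.
text = ("Shannon estimated in 1950 that English text has an entropy of "
        "0.6 to 1.3 bits per character. Common compressors do much worse "
        "than humans, who predict text using knowledge of the world.")
data = text.encode("ascii")

compressed = zlib.compress(data, 9)   # level 9 = best compression
bpc = 8 * len(compressed) / len(data)  # compressed bits per input char
print(f"{bpc:.2f} bits per character")
```

On a large corpus the same ratio (8 * compressed bytes / original bytes)
is how bpc figures like those above are computed.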


> Also, I'd still like to hear how you would create any kind of
> general-purpose AI system out of a compressor.
>

Oh, that's simple. A compressor consists of a predictor and a coder
(arithmetic or ANS). You take out the coder and use the predictor to
predict answers in a Turing test. (People have done this with PAQ; it
would most likely fail after a few words.)

Going the other way is actually more constrained, because the predictor
has to be deterministic in order to compute exactly the same probability
distribution during decompression as it did during compression.
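The predictor/coder split can be sketched with a toy order-1 character
model (this is a simplification for illustration, not PAQ's actual model;
a real compressor would feed these probabilities to an arithmetic coder,
summarized here by the ideal code length -log2 p):

```python
import math

ALPHABET = "abcdefghijklmnopqrstuvwxyz "

class Order1Predictor:
    """Order-1 adaptive character model with add-one smoothing.

    The update rule is deterministic, so the decompressor, seeing the
    same history, computes the identical probability distribution.
    """
    def __init__(self):
        self.counts = {}  # counts[context][char] = occurrences

    def prob(self, context, char):
        row = self.counts.get(context, {})
        total = sum(row.values()) + len(ALPHABET)  # add-one smoothing
        return (row.get(char, 0) + 1) / total

    def update(self, context, char):
        row = self.counts.setdefault(context, {})
        row[char] = row.get(char, 0) + 1

    def predict(self, context):
        # Prediction mode: drop the coder and take the most probable
        # next character given the current context.
        return max(ALPHABET, key=lambda c: self.prob(context, c))

model = Order1Predictor()
text = "the cat sat on the mat and the cat sat on the hat"

# Compression mode: an arithmetic coder would emit -log2 p bits per
# character; summing that gives the ideal compressed size.
bits = 0.0
ctx = " "
for ch in text:
    bits += -math.log2(model.prob(ctx, ch))
    model.update(ctx, ch)
    ctx = ch

print(f"ideal code length: {bits / len(text):.2f} bpc")
print("after 'h', predict:", model.predict("h"))
```

The same object serves both roles: attach a coder and you have a
compressor; call predict() and you have a (very weak) text predictor.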

If my claim seems wrong, it's because the brain can't reproduce a
prediction sequence exactly, and it lacks an encoder. But both are easy
problems for computers. I discuss this at
http://mattmahoney.net/dc/rationale.html

------------------------------------------
Artificial General Intelligence List: AGI
Permalink: 
https://agi.topicbox.com/groups/agi/T409fc28ec41e6e3a-M54f7ee7e2c7294d98b1ecdd4
Delivery options: https://agi.topicbox.com/groups/agi/subscription
