So, since Theo Verelst still doesn't understand, let's discuss it
again, for the Nth time:

Shannon defines entropy as the minimum average length of a message,
in bits, needed to transmit messages drawn from a probabilistic
ensemble. He disregards the codebook size and the size of the
decoder/receiver program, assuming them to be zero.
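
For reference, here is the textbook quantity in a minimal Python
sketch (the function name and the example ensembles are mine, purely
illustrative):

    import math

    def shannon_entropy(probs):
        # H(X) = -sum_i p_i * log2(p_i), in bits per message.
        # Terms with p_i == 0 contribute nothing to the sum.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(shannon_entropy([0.5, 0.5]))  # 1.0 bit per message
    print(shannon_entropy([1.0]))       # 0.0, the single-message case below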

Without the codebook (either transmitted or pre-agreed), the messages
cannot be decoded and the transmission is entirely useless. Therefore
I consider the codebook an integral part of the transmission, so its
size is always nonzero (whether transmitted or pre-agreed), and hence
the actual entropy is always nonzero, even in the case of a single
message with 100% probability.

(I gave some formulas earlier about this.)
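
As a rough illustration of that accounting (just a sketch, not the
earlier formulas; the numbers and names are made up):

    def total_cost_bits(codebook_bits, per_message_bits, n_messages):
        # Everything the receiver needs: the codebook (transmitted or
        # pre-agreed) plus the encoded messages themselves.
        return codebook_bits + n_messages * per_message_bits

    # Single message with probability 1.0: Shannon gives 0 bits per
    # message, but any nonzero codebook keeps the total positive.
    print(total_cost_bits(codebook_bits=64, per_message_bits=0, n_messages=10))  # 64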

The algorithmic information theory folks (Kolmogorov, Solomonoff,
Chaitin, etc.) define entropy in relation to some sort of machine,
typically an abstract Turing machine. Once an eval function is defined
for that machine, the algorithmic entropy of a string of bits can be
defined precisely as the length of the shortest binary program that
outputs that string. This is also always nonzero, and is not a
function of probability. (See Kolmogorov complexity / algorithmic
information theory.)
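
In symbols, relative to a fixed universal machine U (my notation, but
standard in that literature):

    K_U(x) = min{ |p| : U(p) = x }

i.e. the length of the shortest program p that makes U print the bit
string x and halt.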

Entropy _rate_ is the measure of average information per symbol as
the number of symbols asymptotically approaches infinity. There's a
problem with this: when was the last time you made an infinite number
of observations? Since your lifetime is finite, you can never process
an infinite amount of data, so in practice, dividing the total entropy
by the number of symbols will never actually reach zero. Since the
actual entropy of a signal is always nonzero, the only way its
measured entropy _rate_ can be zero is by making an infinite number of
observations, because a finite quantity divided by infinity is zero.
So that can happen only in theory, but never in practice.
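
For reference, the standard definition in question (my transcription):

    entropy rate  H = lim_{n -> infinity} H(X_1, ..., X_n) / n

whereas a finite record of n symbols only ever gives you the estimate
H(X_1, ..., X_n) / n, which stays strictly positive whenever the
numerator is nonzero.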

And since all this was already discussed, what I still don't
understand is: why didn't you guys pay attention the first time? Why
do I need to repeat everything 3-4-5 times?

-P