If you take the entropy _rate_ to be the average entropy per bit
(although there are other possible definitions), then if the total
entropy of a waveform is finite, the entropy rate approaches zero as
the number of observations approaches infinity, since
finite/infinite = 0.

This does *not* mean that the _entropy_ of the signal is zero. All
signals have *some* amount of entropy, some finite, some infinite.
Even a constant signal has nonzero entropy, as you need to transmit
the constant value at least *once* to be able to reconstruct it.
(Well, maybe an infinite constant zero signal has zero entropy.)

So do *not* confuse 'entropy' and 'entropy rate' - they are different
measures. A zero entropy _rate_ does *not* mean that the total entropy
is zero. It just means the total entropy is finite, so over infinitely
many observations the *average* entropy per observation converges to
zero.

When estimating the entropy rate (average entropy per bit), the
estimate will never actually reach zero in any finite amount of time.
The total entropy of the signal also affects how fast the "measured"
entropy rate converges to zero, even when the signal carries only a
finite amount of entropy and thus has an asymptotic entropy rate of
zero. This is why, in estimates, the measured entropy rate of a
complex periodic waveform converges more slowly: the complex shape of
the waveform carries more entropy.

-P
--
dupswapdrop -- the music-dsp mailing list and website:
subscription info, FAQ, source code archive, list archive, book reviews, dsp links
http://music.columbia.edu/cmc/music-dsp
http://music.columbia.edu/mailman/listinfo/music-dsp