It also follows that if the symbol space is binary (0 or 1), then,
assuming a fully decorrelated and uniformly distributed sequence of
bits, the entropy per symbol (bit) is precisely log2(2) = 1.
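
As a quick sanity check, here is a minimal Python sketch (my own
illustration, not from the post) that just evaluates Shannon's
formula H = -sum(p * log2(p)) for a binary source:

    import math

    def binary_entropy(p):
        # Shannon entropy (bits/symbol) of a binary source that
        # emits '1' with probability p and '0' with probability 1-p.
        if p in (0.0, 1.0):
            return 0.0
        q = 1.0 - p
        return -(p * math.log2(p) + q * math.log2(q))

    print(binary_entropy(0.5))  # 1.0   -- uniform: maximum entropy
    print(binary_entropy(0.9))  # ~0.47 -- biased: less entropy

Only the uniform case (p = 0.5) reaches the full 1 bit per symbol;
any bias lowers the entropy.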

From that, it logically follows that an N-bit-long decorrelated and
uniform sequence of bits (= "white noise") has N bits of entropy. In
other words, "white noise" has the _maximum_ amount of entropy: there
are 2^N equally likely N-bit sequences, so identifying one takes N
'yes/no' guesses (halving the candidates each time), and
log2(2^N) = N bits is the most entropy any N-bit sequence can carry.
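
Spelled out with Shannon's formula: each of the 2^N sequences has
probability 2^-N, so H = -sum(2^-N * log2(2^-N)) over all 2^N of
them = N. A quick Python check (again my illustration):

    import math

    N = 8
    p = 2.0 ** -N                     # probability of one N-bit sequence
    H = -(2 ** N) * p * math.log2(p)  # sum of 2^N identical terms
    print(H)                          # 8.0 -- i.e. N bits of entropy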

It follows that if we estimate how much a signal looks like "white
noise" (= how much 'decorrelation' or 'randomness' is in it), we can
estimate its entropy.
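
One crude way to turn that idea into code is a first-order
(histogram) estimate: quantize the samples, count how often each
level occurs, and apply Shannon's formula to the observed
frequencies. This is only my sketch of one possible estimator, not
a method described in the thread, and it deliberately ignores
correlation between successive samples, so it merely upper-bounds
the true entropy rate:

    import math, random

    def histogram_entropy(samples, bins=16):
        # Quantize to 'bins' levels, then apply Shannon's formula to
        # the observed level frequencies.  Ignores sample order, so
        # it only upper-bounds the entropy rate (bits/sample).
        lo, hi = min(samples), max(samples)
        width = (hi - lo) / bins or 1.0  # avoid /0 on a flat signal
        counts = [0] * bins
        for x in samples:
            i = min(int((x - lo) / width), bins - 1)
            counts[i] += 1
        n = float(len(samples))
        return -sum(c / n * math.log2(c / n) for c in counts if c)

    white = [random.uniform(-1.0, 1.0) for _ in range(100000)]
    sine = [math.sin(2 * math.pi * 440 * t / 44100.0)
            for t in range(100000)]
    print(histogram_entropy(white))  # near log2(16) = 4 bits/sample
    print(histogram_entropy(sine))   # lower: far from "white"

A serious estimator would also have to look at joint statistics of
successive samples (or the spectrum) to actually capture the
'decorrelation' part, since a correlated signal can still have a
flat amplitude histogram.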