On 17/07/2015, Peter S <peter.schoffhau...@gmail.com> wrote:
>
> Think of it as this - if your receiver can distinguish only two
> different sets of parameters, then you need to send at least *one* bit
> to distinguish between them - '0' meaning square wave "A", and '1'
> meaning square wave "B". Without sending at least a *single* bit, your
> receiver cannot distinguish between square waves A and B.

It also follows that when your receiver can distinguish between
precisely two parameter sets of nonzero probability, then in
practice, the entropy of a square wave (= parameter set) will *always*
be 1 bit. Your transmitter will always need to send *exactly* one bit
to distinguish square wave "A" from square wave "B", regardless of
their probabilities.

So "you need to specify a distribution [...] in order to talk about
the entropy" is false - if I know that the parameter set has a size
of two (meaning two different square waves), then in practice, the
entropy will *always* be 1 bit - your transmitter will send either 0
or 1, meaning square wave "A" or "B", for _all_ possible combinations
of probabilities. In this case, the probabilities are entirely
irrelevant - you always send exactly 1 bit.

Hence, if the parameter set has a size of at least two, then you must
always send _at least_ one bit - hence nonzero entropy, regardless
of the probability distribution. Entropy is zero _only_ if the
parameter set has a size of 1, with probability p=1.
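To make the fixed-length argument above concrete, here is a minimal
Python sketch (the symbol names "A"/"B" and the codebook are just
illustrative assumptions): a two-symbol source encoded with a
fixed-length code costs exactly one bit per square wave, no matter
how skewed the probabilities are.

```python
import random

# Hypothetical two-element parameter set: square wave "A" or "B".
# Fixed-length codebook: one bit per symbol, by construction.
CODE = {"A": "0", "B": "1"}
INVERSE = {v: k for k, v in CODE.items()}

def encode(symbols):
    """Encode a sequence of A/B choices, exactly one bit each."""
    return "".join(CODE[s] for s in symbols)

def decode(bits):
    """Recover the A/B sequence from the bit string."""
    return [INVERSE[b] for b in bits]

# Heavily skewed source: "A" with probability 0.99, "B" with 0.01.
random.seed(1)
msg = random.choices(["A", "B"], weights=[0.99, 0.01], k=1000)
bits = encode(msg)
assert len(bits) == len(msg)   # still exactly 1 bit per square wave
assert decode(bits) == msg     # receiver distinguishes A from B
```

Note this sketch encodes each square wave independently; it is the
"send one bit every time" scheme described above, not a variable-length
code that pools many symbols together.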

-P
--
dupswapdrop -- the music-dsp mailing list and website:
subscription info, FAQ, source code archive, list archive, book reviews, dsp 
links
http://music.columbia.edu/cmc/music-dsp
http://music.columbia.edu/mailman/listinfo/music-dsp
