Let me express the Hartley entropy another way:

The Hartley entropy is the log of the size of the symbol space, so it is
a good approximation and an upper bound for the actual (Shannon)
entropy. If the symbols are fully decorrelated, then the _maximum_
amount of time it takes to search through the entire symbol space is
determined by the size of the symbol space. So the Hartley entropy
gives the _maximum_ amount of resource (time, money etc.) needed to
search a given symbol space exhaustively, which is often more important
than the _actual_ amount of resource it takes. In that sense it is an
'upper bound'.
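As a minimal sketch of the bound (the function names here are my own,
not any standard library API): the Hartley entropy depends only on how
many distinct symbols there are, while the Shannon entropy also weighs
how often each one occurs, so it can only be lower or equal.

```python
import math
from collections import Counter

def hartley_entropy(symbols):
    # log2 of the number of distinct symbols (the symbol-space size)
    return math.log2(len(set(symbols)))

def shannon_entropy(symbols):
    # -sum p*log2(p) over the observed symbol frequencies
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

data = "aaaaaaab"  # heavily skewed: seven 'a', one 'b'
print(hartley_entropy(data))  # 1.0 bit (two symbols in the space)
print(shannon_entropy(data))  # ~0.544 bits, below the Hartley bound
```

For a uniform (equidistributed) source the two coincide; any skew in
the distribution pushes the Shannon entropy below the Hartley bound.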

Usually it will not take that long to search the symbol space fully,
since this is an upper bound. But we do know that if the symbols are
decorrelated and equidistributed, then *on average* it takes _half_ the
time/resource of the upper bound to find the solution: over N equally
likely candidates, the expected number of trials is (N + 1)/2, which is
roughly N/2.
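This "half on average" claim is easy to check empirically. The sketch
below (my own illustration, not anything from the list archive)
brute-forces a uniformly random target in a space of N = 1024 candidates
many times and averages the number of trials:

```python
import random

def guesses_to_find(target, space_size):
    # brute-force search in a random order; count trials until we hit
    order = list(range(space_size))
    random.shuffle(order)
    return order.index(target) + 1

random.seed(0)  # reproducible run
N = 1024
trials = 10_000
avg = sum(guesses_to_find(random.randrange(N), N) for _ in range(trials)) / trials
print(avg)  # close to (N + 1) / 2 = 512.5, about half the worst case N
```

The worst case is N trials; the empirical average settles near N/2, as
expected for a uniform target.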

We could call this the *expected* entropy: half of the upper bound
given by the size of the symbol space, since *on average* we only need
half the time to find the actual solution. Because
log2(N/2) = log2(N) - 1, this *expected* entropy is exactly 1 bit less
than the Hartley entropy given by the size of the symbol space.
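The 1-bit relationship is just log arithmetic, and it holds for any
symbol-space size N, not only powers of two:

```python
import math

N = 1024
print(math.log2(N))      # Hartley entropy: 10.0 bits
print(math.log2(N / 2))  # "expected" entropy: 9.0 bits, exactly 1 bit less
```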

Often we don't care *precisely* how much entropy there is; we just
want an upper bound, or an expected average, that correlates directly
with the size of the symbol space.
--
dupswapdrop -- the music-dsp mailing list and website:
subscription info, FAQ, source code archive, list archive, book reviews, dsp 
links
http://music.columbia.edu/cmc/music-dsp
http://music.columbia.edu/mailman/listinfo/music-dsp