>How so? It's defined as the log cardinality of the sample space. It is
>independent of the actual distribution of the random variable.

Right, and that is exactly equivalent to using Shannon entropy under the
assumption that the distribution is uniform. That's why it's also called
"maxentropy," since a uniform distribution maximizes Shannon entropy. Every
result you can get with Hartley entropy can be immediately restated with
Shannon entropy, simply by adding a reminder that all distributions are
assumed to be uniform. It's really nothing more than a shorthand for
"Shannon entropy in the case of uniform distributions," which is presumably
why I've never seen anybody actually use it for anything. Hartley entropy
is unable to cope with infinite sets, and so is useless in many of the
interesting settings that Shannon entropy covers. Yes, you can
philosophically assert "but we don't even have to define a probability
distribution at all to come up with Hartley entropy!" but so what? It
doesn't work in any setting where it isn't trivial to define a probability
distribution (and then assume it to be uniform) in the first place. So you
get all of the available results - and a whole lot more - by sticking to
Shannon entropy, and introducing assumptions/estimates of the distribution
as appropriate.
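
To put that concretely, here's a quick Python sketch (the function names
are just mine, for illustration) showing that Shannon entropy collapses
to log2(N) the moment you plug in a uniform distribution over N
outcomes, which is exactly the Hartley value:

import math

def shannon_entropy(probs):
    # H = -sum(p * log2(p)) over the outcomes with nonzero probability
    return -sum(p * math.log2(p) for p in probs if p > 0)

def hartley_entropy(n_outcomes):
    # H0 = log2(|sample space|), no distribution involved
    return math.log2(n_outcomes)

n = 2**16                         # e.g. the sample space of a 16-bit variable
uniform = [1.0 / n] * n
print(shannon_entropy(uniform))   # 16.0
print(hartley_entropy(n))         # 16.0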

Really, one doesn't even need the concept of "entropy" or "randomness" to
express any of the results you can get with Hartley entropy (for example,
that representing the sum of two 16-bit variables generally requires 17
bits). Seems to me that it's just a bunch of fancy verbiage piled on top of
elementary properties of finite sets.
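
That 17-bit figure really is just counting (assuming unsigned values); a
quick check, purely illustrative:

import math

# Two unsigned 16-bit values each lie in 0..65535, so their sum lies in
# 0..131070, i.e. 131071 distinct values.
n_sums = 2 * (2**16 - 1) + 1
print(math.ceil(math.log2(n_sums)))   # -> 17, the bits needed for the sum

No notion of "entropy" required, just the size of a finite set.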

E

On Tue, Oct 14, 2014 at 9:18 AM, Max Little <max.a.lit...@gmail.com> wrote:

> > Hartley entropy doesn't "avoid" any use of probability, it simply
> > introduces the assumption that all probabilities are uniform which
> greatly
> > simplifies all of the calculations.
>
> How so? It's defined as the log cardinality of the sample space. It is
> independent of the actual distribution of the random variable.
>
> --
> Max Little (www.maxlittle.net)
> Wellcome Trust/MIT Fellow and Assistant Professor, Aston University
> TED Fellow (fellows.ted.com/profiles/max-little)
> Visiting Assistant Professor, MIT
> Room MB318A, Aston University
> Aston Triangle, Birmingham, B4 7ET, UK
> UK +44 7710 609564/+44 121 204 5327
> Skype dr.max.little
