> The Hartley entropy
>is invariant to the actual distribution (provided all the
>probabilities are non-zero, and the sample space remains unchanged).

No, the sample space does not require that any of the probabilities be
nonzero. It's defined up front, independently of any probability
distribution.
Indeed, one of the obvious limitations of Hartley entropy is that it
doesn't recognize "embeddings" of a smaller sample space into a larger
one, where you end up with big zero-probability regions. Shannon entropy
captures that; Hartley "entropy" (it really doesn't deserve the name)
just reminds you how big the larger sample space is, which you knew at
the outset, since you had to define it in order to get the Hartley
"entropy" to begin with.
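
To make that concrete with a made-up example: embed a fair coin into an
alphabet of 256 symbols, with the other 254 symbols getting probability
zero. The Shannon entropy is still 1 bit, but the Hartley figure is
log2(256) = 8 bits, which is nothing but the size of the alphabet you
picked for the embedding.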

>I'm being picky here: Hartley and
>Shannon are clearly not the same formula.

The Hartley entropy formula is a special case of the Shannon entropy
formula, when the probabilities are assumed to be uniform.
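
A quick way to see it, as a throwaway Python sketch (shannon() here is
just my own helper, not anything from a library):

    import math

    def shannon(probs):
        # Shannon entropy in bits, skipping zero-probability outcomes
        return -sum(p * math.log2(p) for p in probs if p > 0)

    n = 8
    print(shannon([1.0 / n] * n))  # 3.0 == log2(8), the Hartley value
    print(shannon([0.9] + [0.1 / (n - 1)] * (n - 1)))  # ~0.75, depends on the probabilities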

>Well, I merely said it's interesting that you can define a measure of
>information without probabilities at all, if desired.

That's a measure of *entropy*, not a measure of information.

If you tried to define mutual information using Hartley entropy, it would
be useless: conditioning on another random variable won't change the
sample space (by definition), so it won't reduce the Hartley entropy, and
so the Hartley mutual information must always be zero. The Hartley mutual
information between two random variables that are equal is the same as
between two independent random variables: zero. It's a non-starter for
any kind of actual information theory.
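
For what it's worth, here's a minimal sketch of the same point in Python
(my own throwaway definitions, with the joint sample space taken, as
above, to be the full product and fixed up front regardless of the
distribution):

    import math

    def hartley(sample_space):
        # Hartley entropy: log2 of the size of the sample space,
        # no probabilities involved anywhere
        return math.log2(len(sample_space))

    X = {0, 1, 2, 3}                     # sample space of X
    Y = {0, 1, 2, 3}                     # sample space of Y
    XY = {(x, y) for x in X for y in Y}  # joint sample space, fixed up front

    # "Mutual information" assembled from Hartley entropies:
    print(hartley(X) + hartley(Y) - hartley(XY))  # 0.0, no matter the distribution

The distribution never enters the calculation, so the answer can't come
out as anything but zero.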

E

On Tue, Oct 14, 2014 at 10:10 AM, Max Little <max.a.lit...@gmail.com> wrote:

> >>> Hartley entropy doesn't "avoid" any use of probability, it simply
> >>> introduces the assumption that all probabilities are uniform, which
> >>> greatly simplifies all of the calculations.
> >>
> >>
> >> How so? It's defined as the log cardinality of the sample space. It is
> >> independent of the actual distribution of the random variable.
> >
> >
> > Because as you can see even from Wikipedia, it coincides with Shannon's
> > definition in the case of an equidistributed source.
>
> That's not the point as I see it. I'm being picky here: Hartley and
> Shannon are clearly not the same formula. For any distribution,
> provided the sample space remains unchanged, the Hartley entropy will
> not depend upon the probabilities. On the other hand, the Shannon
> entropy does depend critically upon the probabilities. That's just me
> saying what the formula says, nothing more.
>
> > Thus, Shannon's
> > definition lets you analyze that case, too. Only, Hartley's case doesn't
> > let you analyze but a *very* small part of what the Shannonesque case
> > does.
>
> Well, I merely said it's interesting that you can define a measure of
> information without probabilities at all, if desired.
>
> > Since Shannon's is the more general and more generally taught case as
> > well, you'd do better to learn it first.
>
> Whatever floats your boat!
>
> > The Hartley one is most likely only
> > mentioned by Kolmogorov as a stepping stone to the full-on min-entropy
> > stuff (only fully developed in recent years, and in very odd niches far
> > apart from compression and signaling), and his own computationally minded
> > work (which could still be fully characterized within Shannon's framework,
> > in theory, but in practice goes into the "many separate models" territory
> > I talked about, for computational *complexity* reasons; cf. Greg Chaitin
> > too, if you want to really fuck your mind up in descriptive complexity ;) ).
>
> I haven't seen Hartley entropy used anywhere practical. More general
> Renyi entropies, yes.
>
> M.
>
> > --
> > Sampo Syreeni, aka decoy - de...@iki.fi, http://decoy.iki.fi/front
> > +358-40-3255353, 025E D175 ABE5 027C 9494 EEB0 E090 8BA9 0509 85C2
>
>
>
> --
> Max Little (www.maxlittle.net)
> Wellcome Trust/MIT Fellow and Assistant Professor, Aston University
> TED Fellow (fellows.ted.com/profiles/max-little)
> Visiting Assistant Professor, MIT
> Room MB318A, Aston University
> Aston Triangle, Birmingham, B4 7ET, UK
> UK +44 7710 609564/+44 121 204 5327
> Skype dr.max.little
--
dupswapdrop -- the music-dsp mailing list and website:
subscription info, FAQ, source code archive, list archive, book reviews, dsp 
links
http://music.columbia.edu/cmc/music-dsp
http://music.columbia.edu/mailman/listinfo/music-dsp
