On 11/10/2014, r...@audioimagination.com <r...@audioimagination.com> wrote:
> all "decompression" is is decoding.  you have tokens (usually binary bits or
> a collection of bits) and a code book (this is something that you need to
> understand regarding Huffman or entropy coding), you take the token and look
> it up in the code book and, from that, there
> is a message (sometimes called an "event").

Thanks for the explanation. What I meant is that there is also entropy
(or, as you call it, 'natural information') both in the very algorithm
that does the 'decoding' and, I guess, in the 'code book' as well
(though I'll have to look into that part). See my point? Do programs
and algorithms themselves not contain 'natural information'?
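
For reference, here's a minimal sketch (Python) of what I understand
the codebook lookup to look like - the prefix code below is made up
purely for illustration:

    # Minimal prefix-code (Huffman-style) decoder: accumulate bits from the
    # stream and emit a symbol whenever the buffer matches a codebook entry.
    # The codebook here is invented for illustration, not any standard code.
    codebook = {"0": "A", "10": "B", "110": "C", "111": "D"}

    def decode(bits, codebook):
        out, buf = [], ""
        for b in bits:
            buf += b
            if buf in codebook:          # token found -> emit the "message"/event
                out.append(codebook[buf])
                buf = ""
        return "".join(out)

    print(decode("0101100111", codebook))  # -> "ABCAD"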

What I meant to say is that the program that decodes the information
is also information in itself. You need to have a copy of it somewhere
if you want to do the decoding... implying that the information
contained in the 'decoder' is also required to decode the (encoded)
information. Without the 'decoder' (and the information contained in
its algorithm), you simply cannot decode the information... This is
what I meant by 'implied' entropy: the decoder implies further entropy
in the system (in the form of algorithms), not only the encoded
original information.

The arrogance of academics always amuses me... It's irrelevant to be
nitpicky about details like saying entropy is a property of *messages*
and not bits. What happens to a stream of bits when I send it to you?
At that moment it instantly becomes a 'message'. And what is a
'message' (assuming digital communication)? As far as I know, it is
_always_ bits. So this is one of those irrelevant details that could
be called a 'minor formalism' *at most*, and whether we call it
'bits', 'message', 'communication' or 'signal' is totally irrelevant,
as they mean the same thing in this context.

Let me quote a paragraph from Wikipedia:

"In the case of transmitted messages, these probabilities were the
probabilities that a particular message was actually transmitted, and
the entropy of the message system was a measure of the average amount
of information in a message. For the case of equal probabilities (i.e.
each message is equally probable), the Shannon entropy (in bits) is
just the number of yes/no questions needed to determine the content of
the message.[18]"

Let me emphasize that the situation I'm speaking about is the
simplest case: a single message, equal probabilities, and _all_ I want
to find out is the entropy content of a _single_ message - which the
above paragraph defines as:

"For the case of equal probabilities [...], the Shannon entropy (in
bits) is just the number of yes/no questions needed to determine the
content of the message."

If for some reason, all you can think about is messages and
probabilities and probability distributions, then you'll clearly fail
to see my point...

Quote from Shannon:

"I thought of calling it 'information', but the word was overly used,
so I decided to call it 'uncertainty'. [...] Von Neumann told me, 'You
should call it entropy, for two reasons. In the first place your
uncertainty function has been used in statistical mechanics under that
name, so it already has a name. In the second place, and more
important, nobody knows what entropy really is, so in a debate you
will always have the advantage.'"

> that's what entropy is, Peter.  it's not about transitions of 0 to 1,

Absolutely incorrect. If your mind is so poisoned by academic books
that you're unable to think 'outside the box' anymore, and all you can
do is argue about irrelevant and minor formalisms, then you'll clearly
fail to see what I'm talking about.

If we agree that a message _does_ indeed have entropy, then it also
clearly follows that the entropy _has_ to be _somewhere_, physically,
in the stream of bits. Very simply because, at the level of physical
reality, literally _everything_ is expressed as bits, as far as
digital communication is concerned. So it logically follows that -
since on the physical level _everything_ is represented as 'bits' -
'entropy' also _has_ to be represented on the physical level as
'bits'. There is literally no other way it could be represented in
digital communication, since there is no other way information can be
transmitted and represented than via 'bits'.

Let me emphasize that I'm talking about _representation_ here, and
again, it's the simplest case, and all I am trying to find out is the
'information' (entropy) content of a _single_ message: single message,
equal probabilities, no a-priori knowledge.

What I'm trying to find out is:
- What is the "entropy distribution" (information distribution) of the message?
- Where _exactly_ is the entropy (information) located in the message?
- Could that entropy be extracted or estimated somehow?

This is on some level analogous to 'entropy extraction', which has
been a standard, practical procedure in digital computing for decades,
going back to von Neumann. If you think that I am wrong, then it would
logically follow that von Neumann was also wrong.
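
To be concrete about the von Neumann procedure I'm referring to (his
classic pair-wise debiasing/extraction trick), here is a minimal
Python sketch:

    # von Neumann's classic extractor: read the input in non-overlapping pairs,
    # output 0 for "01" and 1 for "10", and discard "00"/"11" (a trailing
    # unpaired bit is dropped).  The output is unbiased as long as the input
    # bits are independent, even if each one is biased.
    def von_neumann_extract(bits):
        out = []
        for a, b in zip(bits[0::2], bits[1::2]):
            if a != b:
                out.append(a)          # "01" -> "0", "10" -> "1"
        return "".join(out)

    print(von_neumann_extract("1101000110"))  # -> "001"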

Again, if we agree that a message has entropy, and we agree that a
message consists of bits, then it clearly and logically follows that -
unless the message consists of all the _same_ bits - it _has_ to have
an 'entropy distribution' or 'information distribution': some bits in
the message encode (represent) more entropy (information), and other
bits in the message encode (represent) less entropy (information). And
the sum of the entropy represented by each individual bit gives the
total entropy of the whole message. I think this is very logical and
clear. Where is the contradiction?
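
To illustrate what I mean by an 'information distribution': under some
assumed per-bit probability model (the probabilities below are made up
purely for illustration), each bit carries a surprisal of -log2(p),
and those surprisals sum to the total information content of the
message:

    import math

    # Per-bit "information distribution" under an assumed probability model:
    # if bit i had probability p_i of coming out the way it did, its surprisal
    # is -log2(p_i), and the per-bit surprisals sum to the total information.
    # The probabilities below are invented purely for illustration.
    message  = "10110"
    bit_prob = [0.5, 0.9, 0.5, 0.1, 0.7]   # assumed P(observed value) per bit

    surprisal = [-math.log2(p) for p in bit_prob]
    for bit, s in zip(message, surprisal):
        print(f"bit {bit}: {s:.3f} bits of information")
    print(f"total: {sum(surprisal):.3f} bits")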

Again, you can be condescending and argue about irrelevant minor
formalisms all day long, but that is entirely pointless and won't lead
anywhere...
