I agree with Stan.

The Shannon formula measures the "capacity" for information, *not* information
itself. Consider the "snow" pattern on a TV receiving no signal. Its Shannon
measure is much higher than that of an actual picture on the screen, yet we
know that the snow carries no information.
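
If it helps to make that concrete, here is a rough Python sketch (numpy,
with an assumed 8-bit grayscale frame size and made-up contents) comparing
the Shannon measure of a "snow" frame with that of a heavily constrained
picture:

    # A rough sketch, assuming 8-bit grayscale frames and numpy.
    import numpy as np

    def shannon_measure(frame, bins=256):
        """Shannon entropy (bits) of a frame's pixel-value distribution."""
        counts, _ = np.histogram(frame, bins=bins, range=(0, 256))
        p = counts / counts.sum()
        p = p[p > 0]                                 # treat 0*log(0) as 0
        return -(p * np.log2(p)).sum()

    rng = np.random.default_rng(0)
    snow = rng.integers(0, 256, size=(480, 640))     # unconstrained noise
    picture = np.zeros((480, 640), dtype=int)        # heavily constrained frame
    picture[:240, :] = 200                           # say, sky above ground

    print(shannon_measure(snow))      # close to 8 bits: maximal capacity
    print(shannon_measure(picture))   # about 1 bit: strong constraint

The noise scores near the 8-bit maximum precisely because nothing
constrains it, while the picture, which actually informs us, scores far
lower: the formula gauges capacity, not information.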

We should begin with entropy, which is the *lack* of constraint. A
system's entropy is a measure of its degrees of freedom to move around.
Please note that entropy is an apophasis (something that does *not*
exist). That's what makes entropy such a difficult concept to grasp.

In contrast, information is present in all forms of constraint, something
that is palpable (apodictic). In communication theory such constraints are
evident in the associations between characters and signals. But constraint
exists beyond the narrow realm of communication, so that the information
in any structure, static or dynamic, can be quantified using the Shannon
calculus. <http://people.clas.ufl.edu/ulan/files/SymmOvhd.pdf> Thus, the
concept of information *transcends* the realm of communication.

So what about the Shannon measure? The distribution used to compute the
Shannon measure can be compared with any reference distribution and split
into two components. One component, called the average mutual information,
quantifies the constraint between the two distributions, whereas the second,
called the conditional entropy, gauges the residual freedom that each
distribution retains with respect to the other. The two terms sum exactly
to the Shannon measure.
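
In symbols, H(X) = I(X;Y) + H(X|Y). Here is a minimal numerical check
(numpy again, with a purely hypothetical 2x2 joint distribution chosen only
for illustration):

    # A minimal sketch, assuming a made-up 2x2 joint distribution and numpy.
    import numpy as np

    def H(p):
        """Shannon entropy (bits) of any probability array, ignoring zeros."""
        p = np.asarray(p, dtype=float).ravel()
        p = p[p > 0]
        return -(p * np.log2(p)).sum()

    pxy = np.array([[0.4, 0.1],                  # hypothetical joint p(x, y)
                    [0.1, 0.4]])
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)    # marginal distributions

    H_X = H(px)                                  # the Shannon measure of X
    I_XY = H(px) + H(py) - H(pxy)                # average mutual information
    H_X_given_Y = H(pxy) - H(py)                 # conditional entropy

    assert np.isclose(H_X, I_XY + H_X_given_Y)   # the two components sum exactly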

Actually, the term "conditional entropy" is redundant, because entropy can
never be calculated without reference to some other state. This is the import
of the "third law of thermodynamics," and it applies to statistical measures
just as much as to physical thermodynamic ones.

Notice, however, that if the calculation of entropy is conditional, then the
measure of information is likewise conditional (because the two sum to yield
the Shannon measure). This conditional nature of information is an ambiguity
that lies behind much of our confusion about the meaning of information. (It
keeps FIS discussions lively! :)

The Shannon measure and its decomposed components can all be readily
extended to multiple dimensions and applied to a host of truly complex
events and structures.
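
For example (again only a sketch, with an arbitrary made-up three-way
distribution), the mutual-information component generalizes to three
dimensions as the sum of the marginal entropies minus the joint entropy,
what is sometimes called the total correlation:

    # A rough sketch, assuming an arbitrary joint distribution over three
    # binary variables, just to show the multidimensional bookkeeping.
    import numpy as np

    def H(p):
        """Shannon entropy (bits) of any probability array, ignoring zeros."""
        p = np.asarray(p, dtype=float).ravel()
        p = p[p > 0]
        return -(p * np.log2(p)).sum()

    rng = np.random.default_rng(1)
    pxyz = rng.random((2, 2, 2))
    pxyz /= pxyz.sum()                   # normalize to a joint p(x, y, z)

    marginals = [pxyz.sum(axis=(1, 2)),  # p(x)
                 pxyz.sum(axis=(0, 2)),  # p(y)
                 pxyz.sum(axis=(0, 1))]  # p(z)

    # Total constraint among the three variables; zero iff they are independent.
    total_constraint = sum(H(m) for m in marginals) - H(pxyz)
    print(total_constraint)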

Most discussion remains focused on the Shannon measure in abstraction from
all else, which makes the index appear almost meaningless and of limited
utility. The (Bayesian) decomposition of the Shannon formula, however, is
quite rich in what it can reveal, even going so far as to apprehend
proto-meaning. <http://people.clas.ufl.edu/ulan/files/FISPAP.pdf>

The best to all,
Bob

> Entropy
>
> Regarding:
>> So I see it that you conform to Shannon's interpretation of entropy as
>> actually being information <
> Well, in essence we may agree, but I would call this an unfortunate
> choice of words. “Information,” I think, has come to mean so many
> things to so many people that it is *nearly* a useless term. Even
> though I use this term myself, I try to minimize its use. I would say
> that I agree with Shannon’s view of signal entropy as a *type* of
> information – and then extend that concept using type theory, to
> include “meaningful” roles. Only when taken as a whole does
> “information” exist, within my framing.
>
> S: It has been shocking to me that many info-tech persons use the word
> 'information' when what they mean is Shannon's 'information carrying
> capacity' or the word 'entropy' when they mean Shannon's 'informational
> entropy', referring to variety.
>
> STAN


