At 01:36 PM 14/10/2007, Michel Petitjean wrote:
To: fis@listas.unizar.es
Subject: [Fis] Re: info & meaning

bob logan <[EMAIL PROTECTED]> wrote:
> Loet et al - I guess I am not convinced that information and entropy
> are connected. Entropy in physics has the dimension of energy divided
> by temperature. Shannon entropy has no physical dimension - it is
> missing the Boltzmann constant. Therefore how can entropy and Shannon
> entropy be compared, let alone connected?

As indicated on http://en.wikipedia.org/wiki/Shannon_entropy
about the relationship of information entropy to thermodynamic entropy:
<< The inspiration for adopting the word entropy in information theory
came from the close resemblance between Shannon's formula and very similar
known formulae from thermodynamics. >>
In other words, the link between information entropy and thermodynamic entropy
is just a formal analogy between equations in two mathematical models:
one of these models is applied to physical science, the other is applied
to communication science.
I cannot see any deeper connection between these entropies,
but maybe some other people do.
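
Just to pin down the dimensional point before replying: the Gibbs
entropy S = -k_B * sum_i p_i ln p_i of a probability distribution over
microstates is exactly k_B ln 2 times the Shannon entropy of the same
distribution measured in bits, so the "missing" Boltzmann constant is
a choice of units rather than a difference in mathematical form. A
quick sketch (Python, purely illustrative, with a toy distribution):

import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def shannon_entropy_bits(probs):
    # Dimensionless Shannon entropy H = -sum p log2(p), in bits.
    return -sum(p * math.log2(p) for p in probs if p > 0)

def gibbs_entropy(probs):
    # Thermodynamic (Gibbs) entropy S = -k_B * sum p ln(p), in J/K.
    return -K_B * sum(p * math.log(p) for p in probs if p > 0)

# Uniform distribution over 4 microstates: H = 2 bits, S = k_B * ln 4.
p = [0.25, 0.25, 0.25, 0.25]
h = shannon_entropy_bits(p)
s = gibbs_entropy(p)
assert math.isclose(s, K_B * math.log(2) * h, rel_tol=1e-12)

Whether that shared form reflects a real connection or only a formal
analogy is, of course, the substantive question.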

The connection is spelled out in some detail in Leon Brillouin,
Science and Information Theory, 2nd edition 1962, Academic
Press. I believe there is an edition still in print. Also see the
book by my student Scott Muller, which I mentioned in my second-to-last
post. Information, as Schroedinger made clear, is a measure
of negative entropy (complement of entropy, Hmax - Hact) if
you pay attention to physical embodiment. It is not entropy, but
closely related. Of course if you ignore physical embodiment, then
all bets are off, but in that case I, personally, would not know
what you are talking about.
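
To make the complement concrete: for a source with n possible states,
Hmax = log2 n (the uniform case) and Hact is the actual Shannon
entropy, so the negative entropy is Hmax - Hact. A minimal sketch
(Python, illustrative only, with a made-up four-state source):

import math

def shannon_entropy(probs):
    # Actual entropy H_act = -sum p log2(p), in bits.
    return -sum(p * math.log2(p) for p in probs if p > 0)

def negentropy(probs):
    # Negative entropy in the Schroedinger/Brillouin sense:
    # H_max - H_act, where H_max = log2(n) for n possible states.
    h_max = math.log2(len(probs))
    return h_max - shannon_entropy(probs)

# A biased four-state source: H_max = 2 bits, H_act is about 1.36 bits,
# so it embodies roughly 0.64 bits of order relative to the maximally
# disordered, uniform case.
p = [0.7, 0.1, 0.1, 0.1]
print(negentropy(p))  # ~0.64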

Talking about information, or meaning for that matter, without
respect for its embodiment is basically bullshit (or it is mathematics,
which is OK, but not informative, as it is all tautological). So
let's dump the non-precise tautologies and start talking about
something! Shannon's work gave us a useful mathematical
tool for discussing the capacity of a channel. Barwise and
Seligman have more recently given a useful mathematical
tool for describing the properties of a channel. Kolmogorov
and Chaitin have given us useful mathematical tools for
connecting information to logic and probability. Ingarden
has given more abstract tools for combining the various
ideas and grounding probability in information and logic.
Shannon never even told us what probability is, though his
approach works on any reasonable account. None of these
mathematical tools in themselves can give us a scientific
theory of information, however, since they are all tautological.
At best they give more or less compatible formal definitions.
But to know what information is we need an interpretation
that fits all of these approaches (or at least gives a good
reason why it doesn't and shouldn't).
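
To take the channel-capacity point as an example: for a binary
symmetric channel with crossover probability p, Shannon's capacity is
C = 1 - H(p), with H the binary entropy function. A small worked
sketch (Python, illustrative only):

import math

def binary_entropy(p):
    # H(p) = -p log2(p) - (1 - p) log2(1 - p), in bits.
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    # Capacity of a binary symmetric channel with crossover
    # probability p: C = 1 - H(p) bits per channel use.
    return 1.0 - binary_entropy(p)

print(bsc_capacity(0.0))   # 1.0  -- noiseless channel
print(bsc_capacity(0.11))  # ~0.5 -- half a bit per use survives the noise
print(bsc_capacity(0.5))   # 0.0  -- output tells us nothing about the input

Note that nothing in the formula says what the channel is made of; that
only enters when the formalism is applied, which is exactly the
embodiment point.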

Formal theories tell us nothing. They are tautological,
as are all formal definitions. In particular, they cannot tell
us how their interpretations in the real world are connected
or if they are. There is no a priori reason why two models
of Shannon's theory have anything to do with each other
beyond formal analogy, let alone why information
theory and statistical mechanics should have anything to
do with each other just because they share formal aspects.
Mathematics is incapable by its nature of telling us
if any two things are connected in any substantial
way, so independence of development, or common
development, can't tell us anything one way or the
other about the connections in the world.

Along these lines, for example, Barwise and Seligman
give us a formalism that describes an information
channel. However, since causal connection
cannot be formalised, they use regularity as a
substitute. But some regularities are coincidental,
and others are epiphenomenal -- neither of these
can bear information, despite satisfying the
B&S formalism. The same is true of Shannon's
formalism, incidentally. We must pay attention
to how things are embodied in order to get applications
of either formalism right. Fortunately this is not
so hard. But then we can't just ignore this aspect
of the applications of formalisms when we look at
how different formalisms are related to each other,
especially if we do anything other than claim
agnosticism.

It is possible to apply Shannon's communication
theory in ways that are not compatible with
statistical mechanics. It is also possible,
however, to apply Shannon's theory in ways
that are not compatible with each other (in
the same system, no less). This tells us
nothing about how we should apply these theories
or whether they are connected to each other.
They are tautologies. They don't come with
intended applications. These have to be discovered,
not predefined. It is a substantive, empirical issue.
It can't be decided on the basis of knowledge of
independent applications compiled in the Wikipedia
or anywhere else. We have to look at cases where the
ideas are applied in the same way to the same
systems.

The ideas are applied in the book Loet mentioned, Brooks and
Wiley, Evolution as Entropy, 2nd edition, 1988, Chicago. They
have also been applied by David Layzer (see his 1990 book
Cosmogenesis, and a Scientific American article from about
1986), Peter Landsberg, and PCW Davies, among others,
going back to a conference around 1981. See also work by Murray
Gell-Mann, Wheeler, and others associated with the Santa
Fe Institute. This is not new work, and it has been done
by scientists of no small rank. It is there to see, and not
hard to understand if you look. There are some technical
problems still to work out, but I know of no fundamental
problems that have been found with the idea that information
and entropy are not merely analogous, but aspects of the same
thing. This is one of the great scientific discoveries of the last
century. We need to modify our understanding of information
to take it into account. If and when we have done that, perhaps
we can start to say something sensible about the relation
between information and meaning. Otherwise, we will be
building castles without foundations. They may be
impregnable, but they aren't very useful.

I think that the integration of meaning with the rest of
science will be one of the big achievements of this
century, and that information theory will be central
to this (along with a good theory of signs that is
also grounded in the physical world in its functional
states -- we also need to understand function as a physical
process to do this). However, I despair that our current
understanding is blocking us because we are projecting
it as our future understanding, when it has proven
inadequate so far, and I see little reason to think that
all of a sudden the world will change to accommodate
our preconceptions. The answer is out there, not in
us or our old ideas. Let us try to uncover it.

But I feel like we are running around in circles here faster and
faster. Soon we are going to fly up our own wazoos and
disappear.

Feeling very disgruntled,

John


----------
Professor John Collier                                     [EMAIL PROTECTED]
Philosophy and Ethics, University of KwaZulu-Natal, Durban 4041 South Africa
T: +27 (31) 260 3248 / 260 2292       F: +27 (31) 260 3031
http://www.ukzn.ac.za/undphil/collier/index.html

_______________________________________________
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis
