On Jan 27, 11:42 am, John Clark <johnkcl...@gmail.com> wrote:
> On Thu, Jan 26, 2012  Craig Weinberg <whatsons...@gmail.com> wrote:
>
> > If a bucket of water has more of it than DNA, then the word information
> > is meaningless.
>
> You would need to send more, far far more, dots and dashes down a wire to
> inform a intelligent entity what the position and velocity of every
> molecule in bucket of water is than to inform it exactly what the human
> genome is.

It depends on what kind of compression you are using. You could write a
probabilistic equation to simulate any given volume of water far more
easily than the same volume of DNA, especially once you get into
secondary and tertiary structure.
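
To make that concrete, here is a minimal sketch of the kind of
probabilistic description I mean (my own illustration, assuming room
temperature and looking only at velocities; positions in a liquid are
of course harder):

import math
import random

# Maxwell-Boltzmann velocities for water molecules at an assumed room
# temperature: each Cartesian component is Gaussian with standard
# deviation sqrt(kB*T/m).
KB = 1.380649e-23       # Boltzmann constant, J/K
T = 298.15              # assumed temperature, K
M_H2O = 2.99e-26        # mass of one water molecule, kg (about 18 u)

sigma = math.sqrt(KB * T / M_H2O)   # roughly 370 m/s per component

def sample_velocity():
    """Velocity vector (vx, vy, vz) of one molecule, in m/s."""
    return tuple(random.gauss(0.0, sigma) for _ in range(3))

print(sample_velocity())

The whole bucket is just this one distribution repeated over and over;
the description is the equation plus a temperature and a mass, not a
list of molecules.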

> Now what word didn't you understand.

Condescension doesn't impress me. I understand your words perfectly,
it's just that what they are saying seems to be incorrect.

>
> > > A symphony then would have less information and more entropy than random
> > noise.
>
> No, a symphony would have less information but LESS entropy than random
> white noise.

Ok, I think I see what the confusion is. We are operating not only with
different definitions of entropy but with different assumptions about
the universe, assumptions which bear directly on information.

This Q&A:
http://stackoverflow.com/questions/651135/shannons-entropy-formula-help-my-confusion
was the only page I could find that was written simply enough to make
sense to me. Your definition assumes that the universe is a platform
for encoding and decoding information, and mine does not. You are
talking about entropy in terms of resistance to compression, i.e. how
little redundancy there is left to remove. Ok, but the relationship
between Shannon entropy and thermodynamic entropy is not what you are
implying it is. The Wikipedia entry was also helpful:
http://en.wikipedia.org/wiki/Entropy_(information_theory)

"At an everyday practical level the links between information entropy
and thermodynamic entropy are not evident. Physicists and chemists are
apt to be more interested in changes in entropy as a system
spontaneously evolves away from its initial conditions, in accordance
with the second law of thermodynamics, rather than an unchanging
probability distribution. And, as the minuteness of Boltzmann's
constant kB indicates, the changes in S / kB for even tiny amounts of
substances in chemical and physical processes represent amounts of
entropy which are so large as to be off the scale compared to anything
seen in data compression or signal processing. Furthermore, in
classical thermodynamics the entropy is defined in terms of
macroscopic measurements and makes no reference to any probability
distribution, which is central to the definition of information
entropy.

But, at a multidisciplinary level, connections can be made between
thermodynamic and informational entropy, although it took many years
in the development of the theories of statistical mechanics and
information theory to make the relationship fully apparent. In fact,
in the view of Jaynes (1957), thermodynamic entropy, as explained by
statistical mechanics, should be seen as an application of Shannon's
information theory: the thermodynamic entropy is interpreted as being
proportional to the amount of further Shannon information needed to
define the detailed microscopic state of the system, that remains
uncommunicated by a description solely in terms of the macroscopic
variables of classical thermodynamics, with the constant of
proportionality being just the Boltzmann constant."

The key phrase for me here is "the thermodynamic entropy is
interpreted as being proportional to the amount of further Shannon
information needed to define the detailed microscopic state of the
system". This confirms what I have been saying and is the opposite of
what you are saying. Thermodynamic entropy is proportional to the
amount of Shannon information *needed* to encode, compress, or extract
the redundancy from a given description and arrive at a maximally
compressed one. The more entropy or patternlessness you have, i.e. the
closer the probabilities are to equilibrium and the less redundancy
there is, the less information you have and the more Shannon
information you need to avoid lossy compression.
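
For reference, here is a minimal sketch (my own, only to keep the two
quantities straight; the numbers are just for scale) of the Shannon
formula from that Q&A and the Jaynes-style proportionality the
Wikipedia passage describes:

import math

KB = 1.380649e-23   # Boltzmann constant, J/K

def shannon_entropy_bits(probs):
    """H = -sum(p * log2 p), in bits, for a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def thermo_entropy_from_bits(h_bits):
    """Jaynes' reading: S = kB * ln(2) * H, turning missing bits into J/K."""
    return KB * math.log(2) * h_bits

print(shannon_entropy_bits([0.5, 0.5]))   # 1.0 bit for a fair coin
print(thermo_entropy_from_bits(8e9))      # ~7.7e-14 J/K for a gigabyte of missing bits

Even a gigabyte of Shannon information corresponds to a vanishingly
small thermodynamic entropy, which is the "off the scale" point the
article makes.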

This means that DNA, having low entropy compared with pure water, has
high pattern content and high information, and that less Shannon
information is required to describe it. Easier to compress does *not*
mean less information; it means more information is already present,
because in essence the job has already been partially done for you.
Shannon entropy, then, is a measure of drag on compression: a
figurative use of the term entropy for the specific purposes of
encoding and decoding. I am using the literal thermodynamic sense of
entropy, as well as the figurative vernacular sense of entropy as the
degradation of order or coherence; both of these are loosely inversely
proportional to Shannon entropy. The compressibility of a novel or a
picture does not relate to the quality of its information, not to
mention its significance. Weighing art by the pound is not a serious
way to approach a theory of consciousness or qualia.
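
To illustrate the drag-on-compression point, here is a quick sketch on
made-up data (my own illustration, not a measurement of real DNA or
water):

import os
import zlib

patterned = b"GATTACA" * 100000           # stands in for a highly redundant, structured message
noise = os.urandom(len(patterned))        # stands in for pattern-free random bytes

for label, data in [("patterned", patterned), ("noise", noise)]:
    compressed = zlib.compress(data, 9)   # maximum compression level
    print(label, len(data), "->", len(compressed), "bytes")

The patterned string shrinks to a tiny fraction of its original size;
the random bytes don't shrink at all (if anything they grow slightly
from the compressor's overhead), because there is no redundancy for
the encoder to exploit.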


>That's why lossless computer image and sound compression
> programs don't work with white noise, there is no redundancy to remove
> because white noise has no redundancy.  It would take many more dots and
> dashes sent down a wire to describe every pop and click in a piece of white
> noise than to describe a symphony of equal length.

Yes, I see what you mean. I had not heard of Shannon information. It's
an excellent tool for working with statistical data, but tells us
nothing about what information actually is or does. It is an analysis
of how to engineer data quantitatively, and as such, it
(appropriately) takes data for granted. I don't do that.

>
> > If the word information is to have any meaning, quantity and
> > compressibility of data must be distinguished from quality of it's
> > interpretation.
>
> If you want to clearly distinguish these things, and I agree that is a very
> good idea, then you need separate words for the separate ideas. Quality is
> subjective so mathematics can not deal with it, mathematics can work with
> quantity however, so if quality comes into play you can not use the word
> "information" because mathematics already owns that word; but there are
> plenty of other words that you can use, words like "knowledge" or
> "wisdom".

Yes, I agree; that's why I make such a big deal about not reaching for
that term to talk about perception. Perception is all about quality.
"Knowledge" and "wisdom" are already owned by philosophy and religion,
which is why I use sensorimotive awareness, perception, cognition,
feeling, sensing, etc.

>
> > Let's say your definition were true though. What does it have to do with
> > information being directly proportionate to entropy?
>
> The larger the entropy something has the more information it has.

Yes, this was a semantic confusion. I don't understand why you would
use the general term information to describe Shannon information, but
at least I understand what you mean now. Shannon information may be
the 'only reasonable way to measure information', but it is not
information and it does not map to information. It's like measuring a
volume of water in terms of the kilowatt-hours of energy it can
generate: a way of measuring an effect of a quantity of water, but not
a direct measurement of any quantity or quality of the water itself.
It should also be kept in mind that the Shannon approach only counts
as valid because it is the best we have come up with so far. The human
mind does not work like a computer: it does not compress and decode
memories that way at all. The psyche stores experiences as iconic
associations, semantic triggers for sensorimotive cascades, which are
concrete analog presentations that re-present; they are *not*
representations and not digital data.

>
> > If entropy were equal or proportionate to information, then are saying
> > that the more information something contains, the less it matters.
>
> Whether it matters or not is subjective so you should not use the word
> "information" in the above.

No. You should not use the word information when you are talking about
a special-case statistical definition of the word. I don't think it's
an exaggeration to say that 99% of the people who use the word
information use it the way I've been using it, not the way you have
been using it.

> A bucket of water contains far more information
> than the human genome but the human genome has far more knowledge, at least
> I think so, although a bucket of water might disagree with me.

No. The bucket of water has higher thermodynamic entropy, which
requires more Shannon information to describe. If we were to simulate
the water exactly, its encoded description would contain more
information, but that doesn't mean the original has more information;
it just means it has more noise (and less signal).

Craig
