Loet et al. - I guess I am not convinced that information and entropy
are connected. Entropy in physics has the dimension of energy divided
by temperature; Shannon entropy has no physical dimension - it is
missing the Boltzmann constant. How, then, can entropy and Shannon
entropy be compared, let alone connected?
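For what it is worth, the textbook bridge between the two is exactly the missing Boltzmann constant: the Gibbs entropy of a probability distribution is S = -k_B * sum(p ln p), i.e. k_B times the Shannon entropy measured in nats. A minimal sketch (function names are mine, not from this thread):

```python
import math

# Boltzmann constant in J/K (exact SI value)
K_B = 1.380649e-23

def shannon_entropy_nats(probs):
    """Shannon entropy H = -sum p ln p, dimensionless (nats)."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def gibbs_entropy(probs):
    """Gibbs entropy S = k_B * H, carrying the dimension J/K."""
    return K_B * shannon_entropy_nats(probs)

# A fair two-state system: H = ln 2 nats, S = k_B ln 2 J/K
p = [0.5, 0.5]
H = shannon_entropy_nats(p)   # dimensionless
S = gibbs_entropy(p)          # J/K
print(S / K_B)                # dividing out k_B recovers H
```

On this view the two quantities differ by a dimensional constant, not in kind - which is precisely the point under dispute here.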
I am talking about information, not entropy - an organized collection
of organic chemicals must carry more meaningful information than an
unorganized collection of the same chemicals.
On 11-Oct-07, at 5:34 PM, Loet Leydesdorff wrote:
Loet - if your claim is true, then how do you explain that a random
soup of organic chemicals has more Shannon information than an equal
number of organic chemicals organized as a living cell, where
knowledge of some chemicals automatically implies the presence of
others and hence carries less surprise than the soup of random
organic chemicals? - Bob
Dear Bob and colleagues,
In the case of the random soup of organic chemicals, the maximum
entropy of the system is set by the number of chemicals involved (N).
The maximum entropy is therefore log(N). (Because of the randomness of
the soup, the observed Shannon entropy will not be much lower.)
If a grouping variable with M categories is added, the maximum entropy
becomes log(N * M). Ceteris paribus, the redundancy in the system
increases and the relative Shannon entropy can be expected to decrease.
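Loet's argument turns on redundancy, conventionally defined as R = 1 - H / H_max. A small sketch of the ceteris-paribus claim (the numbers for N, M, and the observed entropy are hypothetical):

```python
import math

def redundancy(h_observed, n_states):
    """Redundancy R = 1 - H / H_max, with H_max = log2(n_states)."""
    return 1 - h_observed / math.log2(n_states)

N, M = 8, 4      # N chemicals, M categories of a grouping variable
h_obs = 2.5      # hypothetical observed Shannon entropy, in bits

# Without the grouping variable: H_max = log2(N)
r_before = redundancy(h_obs, N)

# With the grouping variable: H_max = log2(N * M).  Holding the
# observed entropy fixed (ceteris paribus), the redundancy rises.
r_after = redundancy(h_obs, N * M)

print(r_before, r_after)  # the second value is larger
```

The grouping variable enlarges the space of possibilities, so the same observed entropy accounts for a smaller fraction of the maximum - that is the sense in which organization adds redundancy.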
In class, I sometimes use the example of comparing Calcutta with New
York in terms of sustainability. Both have a similar number of
inhabitants, but the organization of New York is more complex to the
extent that the value of the grouping variables (the systems of
communication) becomes more important than the grouped variable (N).
When M is extended to M+1, N possibilities are added.
I hope that this is convincing, or that it provokes your next reaction.
Best wishes,
Loet
_______________________________________________
fis mailing list
fis@listas.unizar.es
http://webmail.unizar.es/mailman/listinfo/fis