Dear Mark,

Thank you for your interest in my FIS paper!
<https://people.clas.ufl.edu/ulan/files/FISPAP.pdf>

I didn't intend it to imply that Shannon-class measures are the
ultimate tool for information science, only to argue against prematurely
rejecting that thrust entirely, as so many do. By looking at Bayesian
forms of the Shannon measure we can address information per se (and even a
form of proto-meaning) and achieve a measure of what is missing. This
latter advantage opens up another dimension to science. (The apophatic had
been implicitly addressed by thermodynamic entropy, which has hardly ever
been recognized as an apophasis. That's why entropy remains so confusing
to so many!)
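
To make that decomposition concrete, here is a minimal sketch in Python
(using numpy, with a made-up joint distribution). It assumes the reading
that the total Shannon entropy splits into mutual information (what is
present) plus a conditional-entropy residue (what is missing, the
apophatic):

    import numpy as np

    # Hypothetical joint distribution p(x, y) over two categorical
    # variables; any non-negative matrix summing to 1 would do.
    p = np.array([[0.40, 0.10],
                  [0.10, 0.40]])

    px = p.sum(axis=1)   # marginal p(x)
    py = p.sum(axis=0)   # marginal p(y)

    def H(q):
        # Shannon entropy in bits, ignoring zero entries
        q = q[q > 0]
        return -np.sum(q * np.log2(q))

    H_joint = H(p.ravel())        # total entropy H(X,Y)
    I = H(px) + H(py) - H_joint   # mutual information I(X;Y), the "information"
    Phi = H_joint - I             # conditional (apophatic) residue

    print(H_joint, I, Phi)        # H(X,Y) = I(X;Y) + Phi

The residue Phi is the uncertainty about each variable that remains even
after the other is fully known; it is the part of the total that the
"information" does not account for.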

The Achilles' heel of Shannon-like measures lies in the underlying
assumption of distinct categories with which to describe the
distributions. The boundaries between categories are often "fuzzy", and,
as you point out, they change with time and growth.

I have been told that mutual information(s) has been defined over fuzzy
sets, but I confess I haven't investigated the advantages of this
extension. As for changing numbers of categories, I note that mutual
information remains well-defined even when the numbers of categories in
the sets being compared are not the same. So I would encourage your
exploration with musical forms.
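
If a numerical illustration helps, here is a small Python sketch (again
with invented numbers) showing that mutual information is perfectly
well-defined when one set has three categories and the other only two:

    import numpy as np

    # Hypothetical joint distribution: X has 3 categories, Y has only 2.
    p = np.array([[0.20, 0.05],
                  [0.05, 0.30],
                  [0.25, 0.15]])

    px, py = p.sum(axis=1), p.sum(axis=0)

    # I(X;Y) = sum over x,y of p(x,y) * log2( p(x,y) / (p(x) p(y)) )
    I = sum(p[i, j] * np.log2(p[i, j] / (px[i] * py[j]))
            for i in range(p.shape[0])
            for j in range(p.shape[1])
            if p[i, j] > 0)

    print(round(I, 3), "bits")

Only the joint distribution matters; the two marginals are free to have
different numbers of categories.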

As to Ashby's metaphor of the homeostat as a machine, my personal preference
is to restrict mechanical analogs for living systems to only those that
are unavoidable. I feel the language of mechanics and mechanisms is
*vastly* overused in biology and draws our attention away from the true
nature of biotic systems.

Thank you for your challenging and astute questions!

Cheers,
Bob

> Dear Bob,
>
> In your Shannon Exonerata paper you have an example of three strings,
> their entropies and their mutual information. I very much admire this
> paper and particularly the critique of Shannon and the emphasis on the
> apophatic, but some things puzzle me. If these are strings of a living
> thing, then we can assume that these strings grow over time. If sequences
> A,B and C are related, then the growth of one is dependent on the growth
> of the other. This process occurs in time. During the growth of the
> strings, even the determination of what is and is not surprising changes
> with the distinction between what is seen to be the same and what isn't.
>
> I have begun to think that it's the relative entropy between growing
> things (whether biological measurements, lines of musical counterpoint,
> learning) that matters, particularly since mutual information is a variety
> of relative entropy. There are dynamics in the interactions. A change in
> entropy for one string with no change in entropy in the others (melody
> and accompaniment) is distinct from everything changing at the same time
> (that's "death and transfiguration"!).
>
> Shannon's formula isn't good at measuring change in entropy. It's even
> less suited to changes in distinctions that occur at critical moments
> ("Aha! A discovery!" or "This is no longer surprising."). The best that we
> might do, I've thought, is to segment your strings over time and examine
> the relative entropies. I've done this with music. Does anyone have any
> other techniques?
>
> On the apophatic, I can imagine a study of the dynamics of Ashby's
> homeostat where each unit produced one of your strings. The machine comes
> to its solution when the entropies of the dials are each 0 (redundancy 1).
> As the machine approaches its equilibrium, the constraint of each dial on
> every other can be explored by the relative entropies between the dials.
> If we wanted the machine to keep on searching and not settle, it's
> conceivable that you might add more dials into the mechanism as its
> relative entropy started to approach 0. What would this do? It would
> maintain a counterpoint in the relative entropies within the ensemble.
> Would adding the dial increase the apophasis? Or the entropy? Or the
> relative entropy?
>
> Best wishes,
>
> Mark

