Quoting Loet Leydesdorff <l...@leydesdorff.net>:

> Dear Bob,
>
> I have now read the book: the description of autocatalysis is very convincing. It
> really clarifies how a "triple helix" (of university-industry-government
> relations) can operate in generating synergy (reducing uncertainty).

Dear Loet,

I'm not sure which book you are referring to: "A Third Window" or "Ecology, the
Ascendent Perspective"?

> You then go on with "Average Mutual Information" (AMI). You don't mention
> that AMI (which most of us call "mutual information") is necessarily positive,
> being a Shannon-type information measure, but that mutual information in more
> than two dimensions can be negative and thus can serve as an indicator of
> synergy or autocatalysis.

It's true, I don't refer to the possibility of negative 3-D mutual  
information in any of my books. What Claudia Pahl-Wostl and I agreed  
to call information in the 3-D case was the entire ensemble (1+2+3+4)  
in the attached diagram. While 1 can go negative, the full ensemble  
does not.
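
For anyone without the diagram at hand, here is a minimal numerical sketch of
that decomposition. It assumes the labeling I have in mind: 1 is the three-way
overlap (the co-information), and 2, 3, and 4 are the pairwise "ears" (the
conditional mutual informations). The variable names and the toy distribution
(X and Y independent fair coins, Z their exclusive-or) are my own, chosen only
to show the signs.

import itertools
from math import log2

# Toy joint distribution: X, Y independent fair coins, Z = X XOR Y.
p = {}
for x, y in itertools.product([0, 1], repeat=2):
    p[(x, y, x ^ y)] = 0.25

def H(*idx):
    """Shannon entropy (bits) of the marginal over the chosen variable indices."""
    marg = {}
    for outcome, prob in p.items():
        key = tuple(outcome[i] for i in idx)
        marg[key] = marg.get(key, 0.0) + prob
    return -sum(q * log2(q) for q in marg.values() if q > 0)

# The three "ears" (regions 2, 3, 4 under the assumed labeling):
# conditional mutual informations, each necessarily non-negative.
ear_xy = H(0, 2) + H(1, 2) - H(2) - H(0, 1, 2)   # I(X;Y|Z)
ear_xz = H(0, 1) + H(1, 2) - H(1) - H(0, 1, 2)   # I(X;Z|Y)
ear_yz = H(0, 1) + H(0, 2) - H(0) - H(0, 1, 2)   # I(Y;Z|X)

# The center (region 1 under the assumed labeling): the co-information,
# which is the one piece that can go negative.
center = H(0) + H(1) - H(0, 1) - ear_xy          # I(X;Y) - I(X;Y|Z)

print(center, ear_xy, ear_xz, ear_yz)            # -1.0 1.0 1.0 1.0
print(center + ear_xy + ear_xz + ear_yz)         #  2.0, never negative

The ensemble stays non-negative in general because the center plus any one ear
is just the corresponding binary mutual information (for instance, center plus
the X-Y ear equals I(X;Y)), and the remaining two ears are themselves
non-negative.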

Recall that information is always relative, so whether a particular  
component is viewed as information or as conditional entropy is a  
matter of perspective. For example, any of the "ears" 2, 3, or 4 can  
be viewed as a conditional entropy in the context of all three  
variables, or they can be viewed as information in the context of  
their respective binary associations.
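
One way to make that concrete (my notation, same assumed labeling as in the
sketch above): the ear shared by X and Y satisfies

  I(X;Y) = I(X;Y|Z) + I(X;Y;Z),

so the very same overlap is read as a conditioned quantity when all three
variables are in view, and as part of the plain binary mutual information when
the third variable is set aside.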

Finally, Gennaro Auletta, in his book "Cognitive Biology: Dealing with  
Information from Bacteria to Minds", maintains that complexity is a  
property of more than two dimensions and that the magnitude of the  
"ears" is a gauge of such complexity. His definition makes some sense,  
in that an "ear" captures how two variables influence one another  
abstracted entirely from variation in the third -- certainly one  
manifestation of complex behavior.

> I usually make reference for this to Ulanowicz (1986, pp. 143 ff.). Is that
> methodologically the same argument? I assume so. Or have you taken these
> measurement issues further?

That reference is fine. I really haven't taken this issue much  
further. I do remark on p. 336 of  
<http://people.biology.ufl.edu/ulan/pubs/METHODS.PDF> that mutual  
information in three or more dimensions can be negative. Somewhere I  
also remarked that a negative mutual information might act like the  
Pauli exclusion principle in physics; that is, it would signify an  
untenable or unstable configuration. Unfortunately, I can't find where  
I wrote down that speculation. Sorry.
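
For a concrete case where that negative sign appears (my example, continuing
the XOR sketch above): I(X;Y) = 0 while I(X;Y|Z) = 1 bit, so the three-way
mutual information is 0 - 1 = -1 bit. There, no single variable tells you
anything about another on its own, yet any two jointly determine the third.
Whether such a configuration is "untenable" in the sense of that speculation I
leave open; the toy case only shows how the negative value arises.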

> Best,
> Loet

The best as always,
Bob
