Dear all,

Is this a question about counting? As I recall, Ashby noted that Shannon 
information is basically counting. What do we do when we count something?

Analogy is fundamental - how things are seen to be the same may be more 
important than how they are seen to be different. 

It seems that this example of DNA is a case where knowledge advances because 
what was once thought to be the same (for example, perceived empirical 
regularities in genetic analysis) is later shown to differ in identifiable 
ways.

Science has tended to assume that by observing regularities, causes can be 
discursively constructed. But another way of looking at it is to say that what 
is discursively constructed are the countable analogies between events. 
Determining analogies constrains our perception of what is countable, and by 
extension what we can say about nature; new knowledge changes that perception.

Information theory (Shannon) demands that analogies are made explicit - the 
indices have to be agreed. What do we count? Why x? Why not y? Otherwise the 
measurements make no sense. I think this is an insight that Ashby had, and why 
he championed Information Theory as analogous to his Law of Requisite Variety 
(incidentally, Keynes's Treatise on Probability contains a similar idea about 
analogy and knowledge). Is there any reason why the "relations of production" 
in a mechanism shouldn't be counted? Determining the analogies is the key 
thing, isn't it?
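
To illustrate: the number of bits measured depends entirely on which events we 
agree to count as "the same". A minimal sketch in Python (the observations and 
the two category schemes are invented for illustration):

# Sketch: the measured bits depend on which events are treated as analogous.
# Observations and category schemes are invented for illustration.
from collections import Counter
from math import log2

observations = ["AA", "AG", "GA", "GG", "AA", "GG", "AG", "AA"]

def entropy(events):
    counts = Counter(events)
    n = sum(counts.values())
    return -sum(c / n * log2(c / n) for c in counts.values())

# Scheme 1: every ordered pair is a distinct event.
h_ordered = entropy(observations)

# Scheme 2: order is agreed to be irrelevant, so "AG" and "GA" are the same.
h_unordered = entropy("".join(sorted(pair)) for pair in observations)

print(f"ordered pairs:   {h_ordered:.3f} bits")    # ~1.906 bits
print(f"unordered pairs: {h_unordered:.3f} bits")  # ~1.561 bits

Same observations, two agreed sets of indices, two different measurements.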

One further point is that determining analogies in theory is different from 
measuring them in practice. Ashby's concept of cybernetics-as-method was: "the 
cyberneticist observes what might have happened but did not". There is a point 
where idealised analogies cannot map onto experience. Then we learn something 
new.

Best wishes,

Mark


-----Original Message-----
From: "Loet Leydesdorff" <l...@leydesdorff.net>
Sent: 09/06/2016 12:52
To: "'John Collier'" <colli...@ukzn.ac.za>; "'Joseph Brenner'" 
<joe.bren...@bluewin.ch>; "'fis'" <fis@listas.unizar.es>
Subject: Re: [Fis] Fw:  "Mechanical Information" in DNA

Dear colleagues, 
 
It seems to me that a definition of information should be compatible with the 
possibility of measuring information in bits. Bits of information 
are dimensionless and “yet meaningless.” The meaning can be provided by the 
substantive system that is thus measured. For example, semantics can be 
measured using a semantic map; changes in the map can be measured as changes in 
the distributions, for example, of words. One can, for example, study whether 
change in one semantic domain is larger and/or faster than in another. The 
results (expressed in bits, dits or nits of information) can be provided with 
meaning by the substantive theorizing about the domain(s) under study. One may 
wish to call this “meaningful information”. 
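
To make this concrete, here is a minimal sketch in Python (the word counts are 
invented, and the Jensen-Shannon divergence is used as one possible measure of 
change, expressed in bits):

# Sketch: change in a "semantic domain" measured as the divergence (in bits)
# between two word distributions. Word counts are invented for illustration.
from math import log2

def entropy(p):
    return -sum(x * log2(x) for x in p if x > 0)

def jsd_bits(p, q):
    # Jensen-Shannon divergence: 0 bits for identical distributions,
    # 1 bit for completely disjoint ones.
    m = [(a + b) / 2 for a, b in zip(p, q)]
    return entropy(m) - (entropy(p) + entropy(q)) / 2

def normalize(counts):
    total = sum(counts)
    return [c / total for c in counts]

# Relative frequencies of the same four words at two points in time.
domain_a_t1 = normalize([40, 30, 20, 10])
domain_a_t2 = normalize([35, 30, 22, 13])
domain_b_t1 = normalize([40, 30, 20, 10])
domain_b_t2 = normalize([10, 20, 30, 40])

print(f"change in domain A: {jsd_bits(domain_a_t1, domain_a_t2):.4f} bits")
print(f"change in domain B: {jsd_bits(domain_b_t1, domain_b_t2):.4f} bits")

The change in domain B is larger; the substantive theory of the domains would 
then have to say what that difference means.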
 
I am aware that several authors have defined information as a difference that 
makes a difference (MacKay, 1969; Bateson, 1973). It seems to me that this is 
“meaningful information”. Information is contained in just a series of 
differences or a distribution. Whether the differences make a difference seems 
to me a matter of statistical testing. Are the differences significant or not? 
If they are significant, they teach us about the (substantive!) systems under 
study, and can thus be provided with meaning in terms of the study of these 
systems.
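
One such test is the G-test, a log-likelihood ratio test that equals 2N times 
the Kullback-Leibler divergence (in nats) between the observed and expected 
distributions. A minimal sketch in Python (invented word counts, with one 
period of observation taken as the expectation for the other):

# Sketch: do the differences between two word-frequency distributions "make
# a difference"? G-test on invented counts; G = 2N * KL-divergence in nats.
from math import log
from scipy.stats import chi2

observed = [50, 30, 20]   # word counts observed in the later period
expected = [45, 35, 20]   # counts expected from the earlier period (same total)

g = 2 * sum(o * log(o / e) for o, e in zip(observed, expected) if o > 0)
df = len(observed) - 1
p_value = chi2.sf(g, df)

print(f"G = {g:.3f}, df = {df}, p = {p_value:.3f}")
# A small p-value would make the differences candidates for substantive
# interpretation; with these invented counts (p ~ 0.5) they are not.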
 
Kauffman et al. (2008, at p. 28) define information as “natural selection 
assembling the very constraints on the release of energy that then constitutes 
work and the propagation of organization.” How can one measure this 
information? Can the difference that these differences make be tested for 
significance? 
 
Varela (1979, p. 266) argued that since the word “information” is derived from 
“in-formare,” the semantics call for the specification of a system of reference 
to be informed. The system of reference provides the information with meaning, 
but the meaning is not in the information which is “yet meaningless”. 
Otherwise, there are as many “informations” as there are systems of reference 
and the use of the word itself becomes a source of confusion.
 
In summary, it seems to me that the achievement of defining information more 
abstractly as measurement in bits (H = - Σ_i p_i log(p_i)) and the availability of 
statistics should not be ignored. From this perspective, information theory can 
be considered as another form of statistics (entropy statistics). A substantive 
definition of information itself is no longer meaningful (and perhaps even 
obscure): the expected information content of a distribution or the information 
contained in the message that an event has happened, can be expressed in bits 
or other measures of information.
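
For concreteness, a minimal sketch in Python of both quantities, for an 
invented distribution:

# Sketch: expected information content of a distribution, and the information
# in the message that one event has happened, in bits, nits, and dits.
from math import log, log2, log10

p = [0.5, 0.25, 0.125, 0.125]   # an invented distribution

# Expected information content of the distribution (Shannon's H).
H = -sum(x * log2(x) for x in p)                     # 1.75 bits

# Information in the message that the last event (p = 0.125) has happened.
print(f"H = {H} bits")
print(f"surprisal: {-log2(p[3])} bits, {-log(p[3]):.3f} nits, "
      f"{-log10(p[3]):.3f} dits")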
 
Best,
Loet
 



Loet Leydesdorff 
Professor, University of Amsterdam
Amsterdam School of Communication Research (ASCoR)
l...@leydesdorff.net ; http://www.leydesdorff.net/ 
Associate Faculty, SPRU, University of Sussex; 
Guest Professor Zhejiang Univ., Hangzhou; Visiting Professor, ISTIC, Beijing;
Visiting Professor, Birkbeck, University of London; 
http://scholar.google.com/citations?user=ych9gNYAAAAJ&hl=en
 
From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of John Collier
Sent: Thursday, June 09, 2016 12:04 PM
To: Joseph Brenner; fis
Subject: Re: [Fis] Fw: "Mechanical Information" in DNA
 
I am inclined to agree with Joseph. That is why I put “mechanical information” 
in shudder quotes in my Subject line.
 
On the other hand, one of the benefits of an information approach is that one 
can add together information (taking care to subtract effects of common 
information – also describable as correlations). So I don’t think that the 
reductionist perspective follows immediately from describing the target 
information in the paper as “mechanical”. “Mechanical”, “mechanism” and similar 
terms can be used (and have been used) to refer to processes that are not 
reducible. “Mechanicism” and “mechanicist” can be used to capture reducible 
dynamics that we get from any conservative system (what I call Hamiltonian 
systems in my papers on the dynamics of emergence – such systems don’t show 
emergent properties except in a trivial sense of being unanticipated). I think 
it is doubtful at best that the mechanical information referred to is 
mechanicist.
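
On the additivity point above, a minimal sketch in Python (with an invented 
joint distribution): adding the information of two variables and subtracting 
their common information recovers the joint, H(X) + H(Y) - I(X;Y) = H(X,Y):

# Sketch: adding together information while subtracting the common part.
# The joint distribution is invented for illustration.
from math import log2

# Joint probabilities p(x, y) for two correlated binary variables.
pxy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

px = {x: sum(v for (a, _), v in pxy.items() if a == x) for x in (0, 1)}
py = {y: sum(v for (_, b), v in pxy.items() if b == y) for y in (0, 1)}

def H(dist):
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# Mutual information: the "common information", describable as correlation.
I = sum(v * log2(v / (px[x] * py[y])) for (x, y), v in pxy.items() if v > 0)

print(f"H(X) + H(Y) - I(X;Y) = {H(px) + H(py) - I:.4f} bits")
print(f"H(X,Y)               = {H(pxy):.4f} bits")   # the two agree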
 
John Collier
Professor Emeritus and Senior Research Associate
University of KwaZulu-Natal
http://web.ncf.ca/collier
 
From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of Joseph Brenner
Sent: Thursday, 09 June 2016 11:10 AM
To: fis <fis@listas.unizar.es>
Subject: [Fis] Fw: "Mechanical Information" in DNA
 
Dear Folks,
 
In my humble opinion, "Mechanical Information" is a contradiction in terms when 
applied to biological processes as described, among others, by Bob L. and his 
colleagues. When applied to isolated DNA, it gives at best a reductionist 
perspective. In the reference cited by Hector, the word 'mechanical' could be 
dropped or replaced by 'spatial' without affecting the meaning.
 
Best,
 
Joseph
 
----- Original Message ----- 
From: Bob Logan 
To: Moisés André Nisenbaum 
Cc: fis 
Sent: Thursday, June 09, 2016 4:04 AM
Subject: Re: [Fis] "Mechanical Information" in DNA
 
Thanks to Moises for the mention of my paper with Stuart Kauffman. If anyone is 
interested in reading it, it can be found at the following Web site: 
 
https://www.academia.edu/7835

[The entire original message is not included.]
_______________________________________________
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis
