Re: [Fis] A provocative issue

2016-12-12 Thread Rafael Capurro

Dear Pedro,

your wise comments are no less inspiring to me than the latest comments 
by Loet, namely that the concept of information is 'relative' to each 
context (and therefore 'objective' with regard to such a context). This 
happens with other concepts too, of course. I said this in my PhD 
(1978). Then Fleissner and Hofkirchner formulated what they called 
"Capurro's trilemma", which is well known on this list. Loet's remarks on 
contextual relativity are a solution (better: a dissolution) of the 
trilemma. But this does not mean that we may not still appeal to analogies, 
equivocities and univocities from time to time, which can also be useful or 
inspiring.


The oriental thinkers you refer to remind me of what I discovered during 
and after our last meeting in Vienna, which helped me to question my own 
presuppositions (blindness) concerning the 'metaphysical' concept of 
'in-formatio' that I analyzed in 1978. I prefer not to quote this in extenso 
but just to draw your attention (maybe your curiosity) to pp. 8 ff. of this 
paper: http://www.capurro.de/icil2016.pdf


I still think that the concepts of message/messenger and translation 
(trans-lation, tra-ducción, Über-setzen) (DNA etc.) are a key issue for 
biology (and not only for biology). I say this without having all your 
knowledge of biology.


If, after twenty years of discussing this concept (I remember our meeting 
in Vienna in 1995), we can say that at least we were able to question our 
presuppositions again and again, then this is the spirit of 
science (and philosophy).


best

Rafael






Re: [Fis] A provocative issue

2016-12-12 Thread Pedro C. Marijuan

Dear Arturo and FISers

We will spare your life!  Some other people on this list also have 
strong reservations about a single, canonical approach to information, 
whether from Shannon, Boltzmann, or Fisher backgrounds. In my case, 
mostly biologically and socially grounded ("sociotype"), I see a complex 
panorama of biological information: mostly "relative" concerning 
communication with the environment (via cellular signaling systems), but 
sort of "objective" concerning the inner self-production processes 
(shared DNA and genetic/translation and many other codes). But both are 
elegantly intertwined in the advancement of a life cycle. My hunch is 
that this type of relative/objective duality culminating in existential 
maintenance has some generality and could be "exported" to physics too. 
It is curious that some oriental thinkers (Chu-Hsi, or Zhu Xi, nine 
centuries ago) had already advanced somewhat similar ideas... Well, above 
all, info is "paradoxical" and has kept all of us amused on this list, 
at least for the past twenty years!


Best--Pedro
PS. Let me remind you that you have not yet answered my topo-evo comments 
from the earlier messages. That point is, I think, very important in the 
present discussion.


On 11/12/2016 at 16:57, tozziart...@libero.it wrote:



Dear FISers,

I know that some of you are going to kill me, but there’s something 
that I must confess.


I notice, from the nice issues raised by Francesco Rizzo, Joseph 
Brenner, and John Collier, that the main concerns are always 
energetic/informational arguments and accounts.


Indeed, the current tenets state that all is information, information 
being a real quantity that can be measured through informational 
entropies.


But… I ask myself: is such a tenet true?

When I cook pasta, I realize that, from my point of view, the cooked 
pasta encompasses more information than the uncooked pasta, because it 
acquires the role of something that I can eat in order to increase my 
chances of preserving myself in the hostile environment that wants 
to destroy me.  However, from the point of view of the bug that eats the 
uncooked pasta, my cooked pasta certainly displays less information.  
Therefore, information is a very subjective measure that, apart from 
its relationship with the observer, does not mean very much…  Who can 
state that an event or a fact displays more information than another one?


And, please, do not counter that information is a quantifiable, 
objective reality because it can be measured through informational 
entropy… Informational entropy, in its original Shannon formulation, 
presupposes an ergodic process (page 8 of Shannon’s seminal 1948 
paper), i.e., every sequence produced by the process is the same in 
its statistical properties, or, in other words, a traveling particle 
eventually crosses all the points of its phase space.  However, in 
physics and biology, facts and events are never ergodic. 
Statistical homogeneity is just a fiction, if we evaluate the world 
around us and our brain/mind.


Therefore, the role of information might not be as fundamental as 
currently believed.
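
A minimal sketch of the entropy point, assuming a made-up two-symbol sequence and two candidate source models (nothing below is from Shannon's paper; it only illustrates how the measured value tracks the probability model the observer adopts):

import math
from collections import Counter

def shannon_entropy(probs):
    # H = -sum_i p_i * log2(p_i), in bits per symbol; zero-probability terms are skipped.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Made-up observed sequence over a two-symbol alphabet.
sequence = "AABABABAAA"

# Model 1: probabilities estimated from the sequence itself (7 A's, 3 B's).
counts = Counter(sequence)
empirical = [c / len(sequence) for c in counts.values()]

# Model 2: the observer assumes a priori that both symbols are equally likely.
uniform = [0.5, 0.5]

print(round(shannon_entropy(empirical), 2))  # ~0.88 bits per symbol
print(round(shannon_entropy(uniform), 2))    # 1.0 bit per symbol

The same string yields about 0.88 bits per symbol under one model and exactly 1 bit under the other, so even within Shannon's own formalism the number obtained depends on the probabilistic description chosen for the source.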


P.S.: topology analyzes information from another point of view, but that 
is an issue for next time, I think…



*Arturo Tozzi*

AA Professor Physics, University North Texas

Pediatrician ASL Na2Nord, Italy

Comput Intell Lab, University Manitoba

http://arturotozzi.webnode.it/







--
-
Pedro C. Marijuán
Grupo de Bioinformación / Bioinformation Group
Instituto Aragonés de Ciencias de la Salud
Centro de Investigación Biomédica de Aragón (CIBA)
Avda. San Juan Bosco, 13, planta 0
50009 Zaragoza, Spain
Tfno. +34 976 71 3526 (& 6818)
pcmarijuan.i...@aragon.es
http://sites.google.com/site/pedrocmarijuan/
-

___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] A provocative issue

2016-12-11 Thread John Collier
Shannon declared in his original book that constraints are information. I don’t 
get the distinction you are trying to make. Also, Shannon information applies 
to continuous systems. If they have a form (are constrained), then they have 
finite information. Infinite information applies only if there are no 
constraints. I don’t see how that could be true in a world that has 
regularities.
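
One standard way to make the claim about constrained continuous systems precise, offered only as a hedged sketch in terms of differential entropy (a formalization assumed here for illustration, not necessarily the one intended above): for a density f,

    h(X) = -\int f(x)\,\log_2 f(x)\,dx ,

and, for example,

    h_{\mathrm{uniform}[a,b]} = \log_2(b-a), \qquad
    h_{\mathcal{N}(\mu,\sigma^2)} = \tfrac{1}{2}\log_2\!\left(2\pi e\,\sigma^2\right).

With bounded support or finite variance the value is bounded above; only a density whose spread is left completely unconstrained can make it arbitrarily large.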

John Collier
Emeritus Professor and Senior Research Associate
Philosophy, University of KwaZulu-Natal
http://web.ncf.ca/collier

From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of Bob Logan
Sent: Sunday, 11 December 2016 10:21 PM
To: tozziart...@libero.it
Cc: fis <fis@listas.unizar.es>
Subject: Re: [Fis] A provocative issue


Re: [Fis] A provocative issue

2016-12-11 Thread John Collier
Arturo, List:

This is a view that was fairly common, especially associated with Edwin Jaynes, 
but the other view has also been put forward by people like Brillouin and, more 
recently, John Wheeler, Murray Gell-Mann and Seth Lloyd, for example. 
Cosmologist David Layzer is another example. Interesting that they are all 
physicists.

My PhD student, Scott Muller, published a book based on his dissertation, 
Asymmetry: The Foundation of Information (Springer 2007), which uses Jaynes’ 
notion of an IGUS together with group theory to define the amount of 
information in an object (I have a different way of doing that). Jaynes held 
that each IGUS had its own measure of information in something, and there was 
no common measure. Scott argued that you can combine the information measured 
by all possible IGUSs (sort of like observers or interactors, but more strictly 
defined) to get the information in the object. I define it as the minimal 
number of yes-no questions required to completely describe the thing. The two 
should be equivalent. So you are siding with Jaynes, I think. I think Scott 
nailed the idea of objective intrinsic information on solid ground.
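
A minimal sketch of the yes-no-question reading, under the simplifying assumption of equally likely, perfectly distinguishable configurations (an assumption added here purely for illustration):

import math

def yes_no_questions(n_states: int) -> int:
    # Minimal number of yes/no questions guaranteed to single out one of
    # n_states equally likely, perfectly distinguishable configurations.
    return math.ceil(math.log2(n_states))

print(yes_no_questions(6))    # 3 questions, although log2(6) is about 2.58 bits
print(yes_no_questions(256))  # 8 questions

For non-uniform probabilities the natural refinement is the expected number of questions under an optimal questioning strategy, which lies within one question of the Shannon entropy of the distribution.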

By the way, Shannon’s measure is of the information capacity of a channel. 
There are better ways to define the information in a real situation (e.g., the 
computational notion of information), but Shannon’s approach can be adapted to 
give the same result with some relatively intuitive assumptions.
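
A minimal sketch of that channel-capacity sense of Shannon's measure, using the textbook binary symmetric channel as a stand-in (the channel model and the crossover values are illustrative assumptions, not anything from this exchange):

import math

def binary_entropy(p: float) -> float:
    # H2(p) in bits, with H2(0) = H2(1) = 0 by convention.
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

def bsc_capacity(crossover: float) -> float:
    # Capacity of a binary symmetric channel, in bits per use: C = 1 - H2(p).
    return 1.0 - binary_entropy(crossover)

print(bsc_capacity(0.0))   # 1.0  (noiseless channel)
print(bsc_capacity(0.11))  # about 0.5
print(bsc_capacity(0.5))   # 0.0  (output independent of input)

The capacity drops from one bit per use for a noiseless channel to zero when the output is statistically independent of the input.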

John Collier
Emeritus Professor and Senior Research Associate
Philosophy, University of KwaZulu-Natal
http://web.ncf.ca/collier

From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of 
tozziart...@libero.it
Sent: Sunday, 11 December 2016 5:57 PM
To: fis@listas.unizar.es
Subject: [Fis] A provocative issue


Dear FISers,
I know that some of you are going to kill me, but there’s something that I must 
confess.
I notice, from the nice issued raised by Francesco Rizzo, Joseph Brenner, John 
Collier, that the main concerns are always energetic/informational arguments 
and accounts.
Indeed, the current tenets state that all is information, information being a 
real quantity that can be measured through informational entropies.
But… I ask to myself, is such a tenet true?
When I cook the pasta, I realize that, by my point of view, the cooked pasta 
encompasses more information than the not-cooked one, because it acquires the 
role of something that I can eat in order to increase my possibility to 
preserve myself in the hostile environment that wants to destroy me.  However, 
by the point of view of the bug who eats the non-cooked pasta, my cooked pasta 
displays less information for sure.  Therefore, information is a very 
subjective measure that, apart from its relationship with the observer, does 
not mean very much…  Who can state that an event or a fact displays more 
information than another one?
And, please, do not counteract that information is a quantifiable, objective 
reality, because it can be measured through informational entropy… 
Informational entropy, in its original Shannon’s formulation, stands for an 
ergodic process (page 8 of the original 1948 Shannon’s seminal paper), i.e.: 
every sequence produced by the processes is the same in statistical properties, 
or, in other words, a traveling particle always crosses all the points of its 
phase space.  However, in physics and biology, the facts and events are never 
ergodic.  Statistical homogeneity is just a fiction, if we evaluate the world 
around us and our brain/mind.
Therefore, the role of information could not be as fundamental as currently 
believed.

P.S.: topology analyzes information by another point of view, but it’s an issue 
for the next time, I think…




Arturo Tozzi

AA Professor Physics, University North Texas

Pediatrician ASL Na2Nord, Italy

Comput Intell Lab, University Manitoba

http://arturotozzi.webnode.it/

___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] A provocative issue

2016-12-11 Thread Bob Logan
Bravo Arturo - I totally agree - in a paper I co-authored with Stuart Kauffman 
and others we talked about the relativity of 
information and the fact that information is not an absolute. Here is the 
abstract of the paper and an excerpt from the paper that discusses the 
relativity of information. The full paper is available at: 
https://www.academia.edu/783503/Propagating_organization_an_enquiry

Best wishes - Bob Logan

Kauffman, Stuart, Robert K. Logan, Robert Este, Randy Goebel, David Hobill and 
Ilya Shmulevich. 2007. Propagating Organization: An Enquiry. Biology and 
Philosophy 23: 27-45.

Propagating Organization: An Enquiry - 
Stuart Kauffman, Robert K. Logan, Robert Este, Randy Goebel, David Hobill and 
Ilya Shmulevich

Institute for Systems Biology, Seattle, Washington

 Abstract: Our aim in this article is to attempt to discuss propagating 
organization of process, a poorly articulated union of matter, energy, work, 
constraints and that vexed concept, “information”, which unite in far from 
equilibrium living physical systems. Our hope is to stimulate discussions by 
philosophers of biology and biologists to further clarify the concepts we 
discuss here. We place our discussion in the broad context of a “general 
biology”, properties that might well be found in life anywhere in the cosmos, 
freed from the specific examples of terrestrial life after 3.8 billion years of 
evolution. By placing the discussion in this wider, if still hypothetical, 
context, we also try to place in context some of the extant discussion of 
information as intimately related to DNA, RNA and protein transcription and 
translation processes. While characteristic of current terrestrial life, there 
are no compelling grounds to suppose the same mechanisms would be involved in 
any life form able to evolve by heritable variation and natural selection. In 
turn, this allows us to discuss at least briefly, the focus of much of the 
philosophy of biology on population genetics, which, of course, assumes DNA, 
RNA, proteins, and other features of terrestrial life. Presumably, evolution by 
natural selection – and perhaps self-organization - could occur on many worlds 
via different causal mechanisms.

Here we seek a non-reductionist explanation for the synthesis, accumulation, 
and propagation of information, work, and constraint, which we hope will 
provide some insight into both the biotic and abiotic universe, in terms of 
both molecular self reproduction and the basic work energy cycle where work is 
the constrained release of energy into a few degrees of freedom. The typical 
requirement for work itself is to construct those very constraints on the 
release of energy that then constitute further work. Information creation, we 
argue, arises in two ways: first information as natural selection assembling 
the very constraints on the release of energy that then constitutes work and 
the propagation of organization. Second, information in a more extended sense 
is “semiotic”, that is about the world or internal state of the organism and 
requires appropriate response. The idea is to combine ideas from biology, 
physics, and computer science, to formulate explanatory hypotheses on how 
information can be captured and rendered in the expected physical 
manifestation, which can then participate in the propagation of the 
organization of process in the expected biological work cycles to create the 
diversity in our observable biosphere.

Our conclusions, to date, of this enquiry suggest a foundation which views 
information as the construction of constraints, which, in their physical 
manifestation, partially underlie the processes of evolution to dynamically 
determine the fitness of organisms within the context of a biotic universe.


Section 4. The Relativity of Information

In Section 2 we have argued that the Shannon conception of information is 
not directly suited to describe the information of autonomous agents that 
propagate their organization. In Section 3 we have defined a new form of 
information, instructional or biotic information, as the constraints that direct 
the flow of free energy to do work.

The reader may legitimately ask the question “isn’t information just 
information?”, i.e., an invariant like the speed of light. Our response to this 
question is no, and to then clarify what seems arbitrary about the definition 
of information. Instructional or biotic information is a useful definition for 
biotic systems just as Shannon information was useful for telecommunication 
channel engineering, and Kolmogorov (Shiryayev 1993) information was useful for 
the study of information compression with respect to Turing machines.

The definition of information is relative and depends on the context in which 
it is to be considered. There appears to be no such thing as absolute 
information that is an invariant that applies to all circumstances. Just as 
Shannon defined information