Arturo, List:

This is a view that was fairly common, especially associated with Edwin Jaynes, 
but the other view has also been put forward by people like Brillouin and, more 
recently, John Wheeler, Murray Gell-Mann and Seth Lloyd. Cosmologist David 
Layzer is another example. It is interesting that they are all physicists.

My PhD student, Scott Muller, published a book based on his dissertation, 
Asymmetry: The Foundation of Information (Springer, 2007), that uses Jaynes’ 
notion of an IGUS together with group theory to define the amount of 
information in an object (I have a different way of doing that). Jaynes held 
that each IGUS had its own measure of information in something, and there was 
no common measure. Scott argued that you can combine the information measured 
by all possible IGUSs (sort of like observers or interactors, but more strictly 
defined) to get the information in the object. I define it as the minimal 
number of yes-no questions required to completely describe the thing. The two 
should be equivalent. So you are siding with Jaynes, I think. I believe Scott 
put the idea of objective intrinsic information on solid ground.
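
To make the yes-no-question definition concrete, here is a minimal sketch in 
Python (purely illustrative; the function name is mine, and it assumes the 
distinguishable states of the object are equally likely):

import math

def yes_no_questions(num_distinguishable_states):
    # Minimal number of binary (yes/no) questions needed to single out one
    # state among N equally likely distinguishable states: ceil(log2(N)).
    return math.ceil(math.log2(num_distinguishable_states))

# An object with 12 distinguishable states needs ceil(log2(12)) = 4
# yes/no questions to describe it completely.
print(yes_no_questions(12))  # -> 4

For unequally likely states, the expected number of questions under an optimal 
questioning strategy is given (to within one question) by the Shannon entropy 
of the state distribution rather than a plain logarithm.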

By the way, Shannon’s measure is of the information capacity of a channel. 
There are better ways to define the information in a real situation (e.g., the 
computational notion of information), but Shannon’s approach can be adapted to 
give the same result with some relatively intuitive assumptions.
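
As a standard textbook illustration of what the capacity of a channel means 
(this sketch is mine and is not specific to the computational notion mentioned 
above): for a binary symmetric channel with crossover probability p, the 
capacity is C = 1 - H(p) bits per use, where H is the binary entropy function.

import math

def binary_entropy(p):
    # H(p) = -p*log2(p) - (1-p)*log2(1-p), in bits.
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(crossover_prob):
    # Capacity of a binary symmetric channel: C = 1 - H(p) bits per channel use.
    return 1.0 - binary_entropy(crossover_prob)

print(bsc_capacity(0.1))  # ~0.531 bits per channel use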

John Collier
Emeritus Professor and Senior Research Associate
Philosophy, University of KwaZulu-Natal
http://web.ncf.ca/collier

From: Fis [mailto:fis-boun...@listas.unizar.es] On Behalf Of 
tozziart...@libero.it
Sent: Sunday, 11 December 2016 5:57 PM
To: fis@listas.unizar.es
Subject: [Fis] A provocative issue


Dear FISers,
I know that some of you are going to kill me, but there’s something that I must 
confess.
I notice, from the nice issues raised by Francesco Rizzo, Joseph Brenner and 
John Collier, that the main concerns are always energetic/informational 
arguments and accounts.
Indeed, the current tenets state that all is information, information being a 
real quantity that can be measured through informational entropies.
But… I ask myself: is such a tenet true?
When I cook pasta, I realize that, from my point of view, the cooked pasta 
encompasses more information than the uncooked one, because it acquires the 
role of something that I can eat in order to increase my chances of preserving 
myself in the hostile environment that wants to destroy me.  However, from the 
point of view of the bug that eats the uncooked pasta, my cooked pasta surely 
displays less information.  Therefore, information is a very subjective measure 
that, apart from its relationship with the observer, does not mean very much…  
Who can state that an event or a fact displays more information than another 
one?
And, please, do not counter that information is a quantifiable, objective 
reality because it can be measured through informational entropy… 
Informational entropy, in Shannon’s original formulation, presupposes an 
ergodic process (page 8 of Shannon’s seminal 1948 paper), i.e., every sequence 
produced by the process has the same statistical properties, or, in other 
words, a traveling particle eventually visits all the points of its phase 
space.  However, in physics and biology, facts and events are never ergodic.  
Statistical homogeneity is just a fiction, if we evaluate the world around us 
and our brain/mind.
Therefore, the role of information may not be as fundamental as currently 
believed.
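
(To make the ergodicity worry concrete, here is a purely illustrative Python 
sketch: the usual plug-in estimate of Shannon entropy from observed symbol 
frequencies can only be equated with the entropy of the source when the 
process is stationary and ergodic.)

from collections import Counter
import math

def empirical_entropy(symbols):
    # Plug-in estimate of Shannon entropy, in bits per symbol, from the observed
    # frequencies in a single sample. Equating this with the source entropy
    # silently assumes the source is stationary and ergodic -- exactly the
    # assumption questioned above.
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

print(empirical_entropy("abracadabra"))  # ~2.04 bits per symbol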

P.S.: Topology analyzes information from another point of view, but that’s an 
issue for next time, I think…




Arturo Tozzi
AA Professor Physics, University North Texas
Pediatrician ASL Na2Nord, Italy
Comput Intell Lab, University Manitoba
http://arturotozzi.webnode.it/

