Dear Mark,

I share your inclination, albeit from a different perspective.

Consider the two statements:

1. Information is impossible without a physical carrier.

2. Information is impossible without the influence of that which does not exist.

There is significant truth in both statements.

I know that Claude Shannon is not a popular personality on FIS, but I
admire how he first approached the subject. He began by quantifying,
not information in the intuitive, positivist sense, but rather the
*lack* of information, or "uncertainty", as he put it. Positivist
information thereby becomes a double negative -- any decrease in
uncertainty.
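
To make that double negative concrete, here is a toy sketch in Python
(the numbers are made up, purely for illustration): uncertainty is
Shannon's H = -sum(p log p), and information is how much of it a
message removes.

from math import log2

def entropy(p):
    # Shannon's measure of uncertainty, in bits
    return -sum(pi * log2(pi) for pi in p if pi > 0)

before = [0.25, 0.25, 0.25, 0.25]   # four equally likely outcomes
after  = [0.70, 0.10, 0.10, 0.10]   # the odds after a message arrives

print(entropy(before))                    # 2.000 bits of uncertainty
print(entropy(after))                     # about 1.357 bits remain
print(entropy(before) - entropy(after))   # about 0.643 bits of information gained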

In short, the quantification of information begins by quantifying
something that does not exist, but nonetheless is related to that
which does. Terry calls this lack the "absential"; I call it the
"apophatic", and it is a major player in living systems!

Karl Popper finished his last book with the exhortation that we need
to develop a "calculus of conditional probabilities". Well, that
effort was already underway in information theory. Using conditional
probabilities allows one to parse Shannon's formula for diversity into
two terms -- one being positivist information (average mutual
information) and the other apophasis (conditional entropy).
<https://people.clas.ufl.edu/ulan/files/FISPAP.pdf>
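
For those who want the bookkeeping spelled out, the decomposition
rests on the standard identity (A indexing inputs, B outputs):

H(A,B) = I(A;B) + [H(A|B) + H(B|A)]

where I(A;B) is the average mutual information (the constraint) and
the bracketed conditional entropies make up the apophatic remainder.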

This duality in nature is evident but often unnoticed in the study of
networks. Most look at networks and immediately see the constraints
between nodes. And so it is. But there is also indeterminacy in almost
all real networks, and this often is disregarded. The proportions
between constraint and indeterminacy can readily be calculated.
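
For anyone who wants to try it, here is a minimal Python sketch of
that calculation on a made-up three-node flow matrix (the flow values
are arbitrary; only the bookkeeping matters):

from math import log2

T = [[0.0, 10.0, 2.0],      # T[i][j]: flow from node i to node j
     [1.0,  0.0, 8.0],      # (hypothetical values)
     [6.0,  1.0, 0.0]]

total = sum(sum(row) for row in T)
p = [[t / total for t in row] for row in T]                  # joint distribution p(i, j)
p_out = [sum(row) for row in p]                              # marginal: outputs of node i
p_in = [sum(p[i][j] for i in range(3)) for j in range(3)]    # marginal: inputs to node j

H = -sum(pij * log2(pij) for row in p for pij in row if pij > 0)
AMI = sum(p[i][j] * log2(p[i][j] / (p_out[i] * p_in[j]))
          for i in range(3) for j in range(3) if p[i][j] > 0)

print("constraint (AMI/H):", round(AMI / H, 3))
print("indeterminacy (1 - AMI/H):", round(1 - AMI / H, 3))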

What is important in living systems (and I usually think of the more
indeterminate ecosystems, rather than organisms [but the point applies
there as well]) is that some degree of conditional entropy is
absolutely necessary for a system's sustainability, as it provides the
flexibility required to construct new responses to novel challenges.

While system constraint usually abets system performance, systems that
become too efficient do so by decreasing their (mutually exclusive)
flexibility, and so they become progressively more vulnerable to collapse.
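
One simple way to picture the trade-off (the scoring function below is
only an illustration, not a quotation from the linked papers): let
a = AMI/H be the fraction of the diversity bound up as constraint. A
score like -a*log(a) vanishes both when there is no organization
(a near 0) and when there is no flexibility left (a near 1).

from math import log

def robustness(a):
    # toy sustainability score for the degree of constraint a = AMI/H, 0 < a < 1
    return -a * log(a)

for a in (0.10, 0.37, 0.60, 0.90, 0.99):
    print(a, round(robustness(a), 3))    # peaks in mid-range, collapses near a = 1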

The lesson for evolutionary theory is clear. Survival is not always a
min/max (fitt*est*) issue. It is about a balance between adaptation
and adaptability. Ecosystems do not attain maximum efficiency. To do
so would doom them.
<https://people.clas.ufl.edu/ulan/files/ECOCOMP2.pdf> The balance also
puts the lie to a major maxim of economics, which is that nothing
should hinder the efficiency of the market. That's a recipe for "boom
and bust". <https://people.clas.ufl.edu/ulan/files/Crisis.pdf>

Mark, I do disagree with your opinion that information cannot be
measured. The wider application of information theory extends beyond
communication and covers the information inherent in structure, or
what John Collier calls "enformation". Measurement is extremely
important there. Perhaps you are disquieted by the relative nature of
information measurements. Such relativity is inevitable. Information
can only be measured with respect to some (arbitrary) reference
distribution (a requirement known in the wider realm of thermodynamics
as "the third law").

Remember how Bateson pointed to the overwhelmingly positivist nature
of physics. Classical physics is deficient in its lack of recognition
of the apophatic. Information theory cures that.

Yes, information requires a material carrier. It is also intimately
affected by, and indeed requires, nonmaterial apophasis.

Best wishes,
Bob

On 4/24/18, Burgin, Mark <mbur...@math.ucla.edu> wrote:
> Dear Colleagues,
>
> I would like to suggest the new topic for discussion
>
>                                        Is information physical?
>
> My opinion is presented below:
>
> Why some people erroneously think that information is physical
>
> The main reason to think that information is physical is the strong
> belief of many people, especially scientists, that there is only
> physical reality, which is studied by science. At the same time, people
> encounter something that they call information.
>
> When people receive a letter, they comprehend that it is information
> because with the letter they receive information. The letter is
> physical, i.e., a physical object. As a result, people start thinking
> that information is physical. When people receive an e-mail, they
> comprehend that it is information because with the e-mail they receive
> information. The e-mail comes to the computer in the form of
> electromagnetic waves, which are physical. As a result, people start
> thinking even more that information is physical.
>
> However, letters, electromagnetic waves and actually all physical
> objects are only carriers or containers of information.
>
> To understand this better, let us consider a textbook. Is it possible to
> say that this book is knowledge? Any reasonable person will tell you that
> the textbook contains knowledge but is not knowledge itself. In the same
> way, the textbook contains information but is not information itself.
> The same is true for letters, e-mails, electromagnetic waves and other
> physical objects because all of them only contain information but are
> not information. For instance, as we know, different letters can contain
> the same information. Even if we make an identical copy of a letter or
> any other text, then the letter and its copy will be different physical
> objects (physical things) but they will contain the same information.
>
> Information belongs to a different (non-physical) world of knowledge,
> data and similar essences. In spite of this, information can act on
> physical objects (physical bodies) and this action also misleads people
> who think that information is physical.
>
> One more misleading property of information is that people can measure
> it. Combined with the erroneous assumption that only physical essences
> can be measured, this naturally brings people to the erroneous
> conclusion that information is physical. However, measuring information
> is essentially different from measuring physical quantities, such as
> weight. There are no “scales” that measure information. Only human
> intellect can do this.
>
> More explanations of why information is not physical can be found in
> the general theory of information.
>
> Sincerely,
> Mark Burgin
>
>
> On 4/24/2018 10:46 AM, Pedro C. Marijuan wrote:
>> Dear FIS Colleagues,
>>
>> A very interesting discussion theme has been proposed by Mark Burgin
>> --he will post at his earliest convenience.
>> Thanks are due to Alberto for his "dataism" piece. Quite probably we
>> will need to revisit that theme, as it is gaining increasing momentum
>> in present "information societies", in science as well as in everyday
>> life...
>> Thanks also to Sung for his interesting viewpoint and references.
>>
>> Best wishes to all,
>> --Pedro
>>
>>
>> -------------------------------------------------
>> Pedro C. Marijuán
>> Grupo de Bioinformación / Bioinformation Group
>> pcmarijuan.i...@aragon.es
>> http://sites.google.com/site/pedrocmarijuan/
>> -------------------------------------------------
>>
>
>

_______________________________________________
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis
