Re: [Fis] Is information physical? 'Signs rust.'

2018-04-26 Thread Mark Johnson
Dear Joseph,

Thank you for this beautiful summary.

That describes the world, doesn't it? (It also describes music, which is a good 
sign.)

I want to say why information matters to me, not to argue about what it is. 

Information matters because it enables these conversations which dissolve 
barriers between disciplines, and ultimately has the capacity to dissolve 
barriers between each of us.

Information is such a powerful concept because everyone thinks they know what 
it is. Really, the conversation is the important thing. We may think we argue, 
but we are all in this dance together. It's always a privilege to have one's 
certainties shattered - who'd have thought the information in email messages 
could be so powerful?!

Best wishes,

Mark

Re: [Fis] Is information physical? 'Signs rust.'

2018-04-26 Thread joe.bren...@bluewin.ch
Information refers to changes in patterns of energy flow, some slow (frozen), 
some fast, some quantitative and measurable, some qualitative and 
non-measurable, some meaningful and some meaningless, partly causally effective 
and partly inert, partly present and partly absent, all at the same time.

Best wishes,

Joseph

>Original Message
>From: u...@umces.edu
>Date: 25/04/2018 - 08:14 (PDT)
>To: mbur...@math.ucla.edu
>Cc: fis@listas.unizar.es
>Subject: Re: [Fis] Is information physical?
>
>Dear Mark,
>
>I share your inclination, albeit from a different perspective.
>
>Consider the two statements:
>
>1. Information is impossible without a physical carrier.
>
>2. Information is impossible without the influence of that which does not 
>exist.
>
>There is significant truth in both statements.
>
>I know that Claude Shannon is not a popular personality on FIS, but I
>admire how he first approached the subject. He began by quantifying,
>not information in the intuitive, positivist sense, but rather the
>*lack* of information, or "uncertainty", as he put it. Positivist
>information thereby becomes a double negative -- any decrease in
>uncertainty.
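Bob's "double negative" can be sketched numerically: positivist information appears as the decrease in Shannon uncertainty between a prior and a posterior distribution. A minimal sketch, with an invented example (the eight-sided die and both distributions are hypothetical, chosen only for illustration):

```python
import math

def entropy(p):
    """Shannon uncertainty H(p) = -sum p_i log2 p_i, in bits."""
    return -sum(q * math.log2(q) for q in p if q > 0)

# Hypothetical example: a fair eight-sided die before any observation,
# then after learning that the roll was a 1 or a 2.
prior = [1/8] * 8                   # maximal uncertainty: 3 bits
posterior = [1/2, 1/2] + [0] * 6    # remaining uncertainty: 1 bit

# Information as a "double negative": the decrease in uncertainty.
information_gained = entropy(prior) - entropy(posterior)  # 2 bits
```

Nothing positive is measured directly; the quantity is the difference between two amounts of what is *not* known.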
>
>In short, the quantification of information begins by quantifying
>something that does not exist, but nonetheless is related to that
>which does. Terry calls this lack the "absential", I call it the
>"apophatic" and it is a major player in living systems!
>
>Karl Popper finished his last book with the exhortation that we need
>to develop a "calculus of conditional probabilities". Well, that
>effort was already underway in information theory. Using conditional
>probabilities allows one to parse Shannon's formula for diversity into
>two terms -- one being positivist information (average mutual
>information) and the other apophasis (conditional entropy).
>
>
>This duality in nature is evident but often unnoticed in the study of
>networks. Most look at networks and immediately see the constraints
>between nodes. And so it is. But there is also indeterminacy in almost
>all real networks, and this often is disregarded. The proportions
>between constraint and indeterminacy can readily be calculated.
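The parsing described above can be sketched in a few lines: the Shannon diversity of a joint flow distribution splits into average mutual information (the constraint between nodes) plus a residual conditional entropy (the indeterminacy), and the proportions of each are then direct ratios. A minimal sketch, assuming a tiny three-node network whose joint distribution is entirely invented for illustration:

```python
import math

def h(x):
    """One term of Shannon diversity: -x log2 x (0 when x is 0)."""
    return -x * math.log2(x) if x > 0 else 0.0

# Hypothetical joint distribution p(i, j) of flow from node i to node j
# in a three-node network; entries sum to 1.
p = [[0.25, 0.05, 0.00],
     [0.05, 0.25, 0.10],
     [0.00, 0.10, 0.20]]

row = [sum(r) for r in p]           # marginal p(i)
col = [sum(c) for c in zip(*p)]     # marginal p(j)

# Shannon diversity of the joint distribution.
H = sum(h(pij) for r in p for pij in r)

# Average mutual information: the "positivist" constraint term.
ami = sum(pij * math.log2(pij / (row[i] * col[j]))
          for i, r in enumerate(p) for j, pij in enumerate(r) if pij > 0)

# Residual conditional entropy: the apophatic, indeterminate term.
cond = H - ami

constraint_fraction = ami / H    # share of diversity bound up in constraint
flexibility_fraction = cond / H  # share left as indeterminacy
```

The two fractions always sum to one, so tightening constraint necessarily spends flexibility, which is the trade-off invoked below for over-efficient systems.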
>
>What is important in living systems (and I usually think of the more
>indeterminate ecosystems, rather than organisms [but the point applies
>there as well]) is that some degree of conditional entropy is
>absolutely necessary for systems sustainability, as it provides the
>flexibility required to construct new responses to novel challenges.
>
>While system constraint usually abets system performance, systems that
>become too efficient do so by decreasing their (mutually exclusive)
>flexibility and become progressively vulnerable to collapse.
>
>The lesson for evolutionary theory is clear. Survival is not always a
>min/max (fitt*est*) issue. It is about a balance between adaptation
>and adaptability. Ecosystems do not attain maximum efficiency. To do
>so would doom them. The balance also
>puts the lie to a major maxim of economics, which is that nothing
>should hinder the efficiency of the market. That's a recipe for "boom
>and bust". 
>
>Mark, I do disagree with your opinion that information cannot be
>measured. The wider application of information theory extends beyond
>communication and covers the information inherent in structure, or
>what John Collier calls "enformation". Measurement is extremely
>important there. Perhaps you are disquieted by the relative nature of
>information measurements. Such relativity is inevitable. Information
>can only be measured with respect to some (arbitrary) reference
>distribution (which is also known in the wider realm of thermodynamics
>as "the third law".)
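The relativity of measurement described above can be illustrated with relative entropy (Kullback-Leibler divergence), a standard way of quantifying information in a distribution with respect to a reference. A minimal sketch; the observed distribution and both reference distributions are hypothetical:

```python
import math

def kl_divergence(p, q):
    """Information in p measured relative to reference distribution q, in bits."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Hypothetical observed distribution and two different reference choices.
observed = [0.7, 0.2, 0.1]
uniform = [1/3, 1/3, 1/3]   # one common (but still arbitrary) reference
skewed = [0.5, 0.3, 0.2]    # a different, equally admissible reference

# The measured amount of information depends on the reference chosen.
info_vs_uniform = kl_divergence(observed, uniform)
info_vs_skewed = kl_divergence(observed, skewed)
```

The same observed distribution yields different values against different references, which is the inevitability being pointed out: the number is always relative to a chosen baseline.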
>
>Remember how Bateson pointed to the overwhelmingly positivist nature
>of physics. Classical physics is deficient in its lack of recognition
>of the apophatic. Information theory cures that.
>
>Yes, information requires a material carrier. It also is intimately
>affected by and requires nonmaterial apophasis.
>
>Best wishes,
>Bob
>
>On 4/24/18, Burgin, Mark  wrote:
>> Dear Colleagues,
>>
>> I would like to suggest the new topic for discussion
>>
>>Is information physical?
>>
>> My opinion is presented below:
>>
>> Why some people erroneously think that information is physical
>>
>> The main reason to think that information is physical is the strong
>> belief of many people, especially, scientists that there is only
>> physical reality, which is studied by science. At the same time, people
>> encounter something that they call information.
>>
>> When people receive a letter, they comprehend that it is information
>> because with the letter they receive information. The letter is
>> physical, i.e., a physical object. As a result, people start thinking
>> that information is physical.