Dear Sung, 
I'm sorry, but the "Unreasonable Effectiveness of Mathematics" still holds 
true.  
Forget philosophical concepts like Yin and Yang because, in some cases and 
contexts, entropy is negative.
To give just one example:
"Since the entropy H(S|O) can now become negative, erasing a system can result 
in a net gain of work (and a corresponding cooling of the environment)."
https://www.nature.com/nature/journal/v474/n7349/full/nature10123.html
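
The mechanism behind that sentence is standard quantum information theory: 
for a maximally entangled pure state, the joint entropy S(SO) is zero while 
the marginal S(O) is one bit, so the conditional entropy 
H(S|O) = S(SO) - S(O) = -1 bit. A minimal numerical sketch in Python (my own 
illustration, not code from the paper):

    import numpy as np

    # Bell state (|00> + |11>)/sqrt(2): a maximally entangled pure state.
    psi = np.zeros(4)
    psi[0] = psi[3] = 1 / np.sqrt(2)
    rho_SO = np.outer(psi, psi)  # joint density matrix (pure state)
    # Partial trace over S: reshape to (s, o, s', o') and trace out s.
    rho_O = rho_SO.reshape(2, 2, 2, 2).trace(axis1=0, axis2=2)

    def vn_entropy_bits(rho):
        # von Neumann entropy in bits, dropping zero eigenvalues
        ev = np.linalg.eigvalsh(rho)
        ev = ev[ev > 1e-12]
        return float(-(ev * np.log2(ev)).sum())

    print(vn_entropy_bits(rho_SO) - vn_entropy_bits(rho_O))  # -1.0 bit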
--
Sent from Libero Mail for Android. Friday, 13 October 2017, 10:11 PM +02:00, 
from Sungchul Ji  s...@pharmacy.rutgers.edu :

>Hi Arturo,
>
>( 1 )  I don't understand where you got (or how you can justify) S = 1 J/K in 
>your statement,
>
>" With the same probability mass function, you can see that H = S/(ln(2)*k B ),
> so setting S = 1J/K gives a Shannon entropy of 1.045×10 23  bits."
>
>( 2 ) I can see how one can get H = S/(ln(2)·k_B) mathematically, but what 
>does this equality mean physically?
>( 3 ) This reminds me of what Schroedinger did when he came up with the 
>conclusion that "negative entropy" is equivalent to "order", which led to 
>Brillouin's so-called "Negentropy Principle of Information" (NPI) [1, 2].
>
>Simply by multiplying both sides of the Boltzmann equation by negative one, 
>Schroedinger obtained the following formula:
>
>-S = -k ln W = k ln(1/W)
>
>and then equating W with disorder, D, led him to
>
>-S = k ln(1/D).
>
>Since (1/D) can be interpreted as the opposite of "disorder", namely, "order", 
>he concluded that
>
>"negative entropy = order".
>
>As you can see, the above derivation is mathematically sound but the result 
>violates the Third Law of Thermodynamics,
> according to which thermodynamic entropy cannot be less than zero.
>
>Thus, in 2012 I was led to formulate what I called the "Schroedinger paradox" 
>as follows [3]:
>
>"Schroedinger's paradox refers to the mathematical equations, concepts, or 
>general statements that are formally true
> but physically meaningless." 
>
>( 4 ) If my argument in ( 3 ) is valid, this may provide an example of what 
>may be called
>
>the "Unreasonable Ineffectiveness of Mathematics"
>
>which, together with Wigner's "Unreasonable Effectiveness of Mathematics", 
>may constitute a Yin-Yang pair of mathematics.
>
>All the best.
>
>Sung
>
>References:
>[1] Brillouin, L. (1953). Negentropy Principle of Information. J. Applied 
>Phys. 24(9), 1152-1163.
>[2] Brillouin, L. (1956). Science and Information Theory. Academic Press, 
>Inc., New York, pp. 152-156.
>[3] Ji, S. (2012). The Third Law of Thermodynamics and “Schroedinger's 
>Paradox”. In: Molecular Theory of the Living Cell: Concepts, Molecular 
>Mechanisms, and Biomedical Applications. Springer, New York, pp. 12-15. 
>PDF at http://www.conformon.net/wp-content/uploads/2014/03/Schroedinger_paradox.pdf
>
>----------------------------------------------------------------------
>From: tozziart...@libero.it < tozziart...@libero.it >
>Sent: Friday, October 13, 2017 4:43 AM
>To: Sungchul Ji;  fis@listas.unizar.es
>Subject: R: Re: [Fis] A PROPOSAL ABOUT THE DEFINITION OF INFORMATION
> 
>Dear Sung, 
>One J/K corresponds to 1.045×10^23 bits.
>
>Indeed, the Gibbs entropy formula states that the thermodynamic entropy S 
>equals k_B·Σ_i[p_i·ln(1/p_i)], with units of J/K, where k_B is the Boltzmann 
>constant and p_i is the probability of microstate i. On the other hand, the 
>Shannon entropy is defined as H = Σ_i[p_i·log2(1/p_i)], with units of bits. 
>With the same probability mass function, you can see that H = S/(ln(2)·k_B), 
>so setting S = 1 J/K gives a Shannon entropy of 1.045×10^23 bits.
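>
>As a quick check of that conversion, a minimal Python sketch (my own 
>illustration; the constant is the exact 2019 SI value):
>
>import math
>
>k_B = 1.380649e-23           # Boltzmann constant, J/K
>S = 1.0                      # thermodynamic entropy, J/K
>H = S / (math.log(2) * k_B)  # Shannon entropy, bits
>print(f"{H:.4e} bits")       # ~1.045e+23 bits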
>
>On the other hand, the energy consumption per bit of data on the Internet is 
>around 75 μJ at low access rates and decreases to around 2-4 μJ at an access 
>rate of 100 Mb/s.
>See: 
>http://www.ing.unitn.it/~fontana/GreenInternet/Recent%20Papers%20and%20p2p/Baliga_Ayre_Hinton_Sorin_Tucker_JLT0.pdf
>
>Further, according to Landauer's principle, a minimum amount of heat – 
>roughly 10^-21 J per erased bit – must be dissipated when information is 
>destroyed.
>http://physicsworld.com/cws/article/news/2012/mar/12/wiping-data-will-cost-you-energy
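>
>For a concrete number, a minimal sketch of the Landauer bound k_B·T·ln 2, 
>assuming room temperature (T = 300 K is my assumption, not a figure from the 
>article):
>
>import math
>
>k_B = 1.380649e-23             # Boltzmann constant, J/K
>T = 300.0                      # assumed room temperature, K
>E_bit = k_B * T * math.log(2)  # Landauer limit per erased bit
>print(f"{E_bit:.2e} J")        # ~2.87e-21 J, i.e. roughly 10^-21 J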
>
>
>In other words, if you use free energy to assess information, the approach 
>works just as well, giving a quantifiable value.
>
>
>Arturo Tozzi
>AA Professor Physics, University North Texas
>Pediatrician ASL Na2Nord, Italy
>Comput Intell Lab, University Manitoba
>http://arturotozzi.webnode.it/  
>
>
>>----Original message----
>>From: "Sungchul Ji" < s...@pharmacy.rutgers.edu >
>>Date: 12/10/2017 22.08
>>To: "Francesco Rizzo"< 13francesco.ri...@gmail.com >, "Pedro C. Marijuan"< 
>>pcmarijuan.i...@aragon.es >
>>Cc: fis@listas.unizar.es
>>Subject: Re: [Fis] A PROPOSAL ABOUT THE DEFINITION OF INFORMATION
>>
>>Hi FISers,
>>
>>The following statement cannot be true:
>>>"a proposal: information might stand for free energy."
>>For one thing, the unit of information is the bit, and that of energy is the 
>>calorie or the erg.
>>
>>The proper relation between information and energy (including free energy) 
>>may be complementarity, just as is the relation between wave and particle. 
>>According to the ITR (Irreducible Triadic Relation) model of signs and 
>>communication, information and energy are entangled in the sense that both 
>>are irreplaceably implicated in the process of communication. Both 
>>information and energy are needed for communication, the minimum energy 
>>cost of transmitting one bit of information being ~0.6 kcal/mole, according 
>>to Shannon.
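>>
>>A minimal sketch of that order of magnitude, assuming the thermodynamic 
>>minimum k_B·T·ln 2 per bit at an assumed room temperature of 298 K, scaled 
>>to a mole of bits (this gives ~0.4 kcal/mole, the same order as the ~0.6 
>>quoted above):
>>
>>import math
>>
>>k_B = 1.380649e-23   # Boltzmann constant, J/K
>>N_A = 6.02214076e23  # Avogadro's number, 1/mol
>>T = 298.0            # assumed room temperature, K
>># k_B*T*ln2 per bit, scaled to a mole of bits, in kcal/mol
>>E = k_B * T * math.log(2) * N_A / 4184.0
>>print(f"{E:.2f} kcal/mol")  # ~0.41 kcal/mol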
>>
>>All the best.
>>
>>Sung
>>
>>----------------------------------------------------------------------
>>From: Fis < fis-boun...@listas.unizar.es > on behalf of Francesco Rizzo < 
>>13francesco.ri...@gmail.com >
>>Sent: Thursday, October 12, 2017 3:00 AM
>>To: Pedro C. Marijuan
>>Cc: fis@listas.unizar.es
>>Subject: Re: [Fis] A PROPOSAL ABOUT THE DEFINITION OF INFORMATION
>> 
>>Dear Pedro and dear all,
>>the inputs and outputs that living cells exchange with the environment are 
>>nothing other than matter, energy, and information entering (INPUT) and 
>>leaving (OUTPUT), giving rise to the process of TRANS-IN-FORM-ATION that I 
>>developed in the New Economics with regard to entropic production systems 
>>(degraded energy, or dis-information) and neg-entropic ones (free energy, 
>>or information), which have a general character. So much so that about 20 
>>years ago I applied and referred this to the cell, which establishes with 
>>its (biological-natural) environment a relationship similar to the one an 
>>enterprise (firm) establishes with its (social-economic) environment. At 
>>bottom, biochemistry and economics turn out to be complementary in the life 
>>of human beings, whose existence and knowledge can be well understood 
>>according to the empirical or concrete onto-logic, otherwise known as LIR, 
>>which Joseph Brenner's generosity has also glimpsed in my scientific 
>>analysis. Unfortunately this subject, well expressed and summarized by the 
>>process of TRANS-IN-FORM-ATION and repeatedly the object of comparison and 
>>discussion in the FIS debate, is little known because it is set out in some 
>>twenty of my books written in Italian.
>>In any case, TIME is (always a gentleman and provides) the right 
>>INFORMATION, performing the function of the LANGUAGE of LANGUAGES that 
>>everyone can com-prehend, sooner or later. Thank you for the opportunity 
>>you give me, starting with Pedro, who deserves great credit for the 
>>initiation-mediation in this regard.
>>A hug, Francesco Rizzo.
>>
>>
>>2017-10-11 14:30 GMT+02:00 Pedro C. Marijuan  < pcmarijuan.i...@aragon.es > :
>>>Dear Arturo and colleagues,
>>>
>>>I think that relating information to free energy can be a good idea. I am 
>>>not sure whether the expressions derived from Gibbs free energy (below) 
>>>have sufficient generality; at least they work very well for chemical 
>>>reactions. And it is in the biomolecular (chemical) realm where the big 
>>>divide between "animate information" and "inanimate information" occurs. 
>>>In that sense, I include herein the scheme we have just published of 
>>>prokaryotic cells in their management of the "information flow". In a 
>>>coming message I will make suggestions on how the mapping of biological 
>>>information may lead to a more general approach that includes the other 
>>>varieties of information (anthropocentric, physical, chemical, 
>>>cosmological, etc.). Biological information is the most fundamental and 
>>>radical track to unite the different approaches!
>>>
>>>Best--Pedro
>>>
>>>Pedro C. Marijuán, Jorge Navarro, Raquel del Moral. 
>>>How prokaryotes ‘encode’ their environment: Systemic tools for organizing 
>>>the information flow. Biosystems, October 2017. 
>>>https://doi.org/10.1016/j.biosystems.2017.10.002
>>>
>>>Abstract
>>>An important issue related to code biology concerns the cell's 
>>>informational relationships with the environment. As an open 
>>>self-producing system, a great variety of inputs and outputs are necessary 
>>>for the living cell, not only consisting of matter and energy but also 
>>>involving information flows. The analysis here of the simplest cells will 
>>>involve two basic aspects. On the one side, the molecular apparatuses of 
>>>the prokaryotic signaling system, with all its variety of environmental 
>>>signals and component pathways (which have been called 1-2-3 Component 
>>>Systems), including the role of a few second messengers which have been 
>>>pointed out in bacteria too. And on the other side, the gene transcription 
>>>system as depending not only on signaling inputs but also on a diversity 
>>>of factors. Amidst the continuum of energy, matter, and information flows, 
>>>there seems to be evidence for signaling codes, mostly established around 
>>>the arrangement of life-cycle stages, in large metabolic changes, or in 
>>>the relationships with conspecifics (quorum sensing) and within microbial 
>>>ecosystems. Additionally, considering the complexity growth of signaling 
>>>systems from prokaryotes to eukaryotes, four avenues or “roots” for the 
>>>advancement of such complexity would come out. A comparison will be 
>>>established between the signaling strategies and organization of both 
>>>kinds of cellular systems. Finally, a new characterization of 
>>>“informational architectures” will be proposed in order to explain the 
>>>coding spectrum of both prokaryotic and eukaryotic signaling systems. 
>>>Among other evolutionary aspects, cellular strategies for the construction 
>>>of novel functional codes via the intermixing of informational 
>>>architectures could be related to the persistence of retro-elements with 
>>>obvious viral ancestry.
>>>-------------------------------------------
>>>
>>>
>>>On 10/10/2017 at 11:14,  tozziart...@libero.it wrote:
>>>>Dear FISers, 
>>>>a proposal: information might stand for free energy.  
>>>>
>>>>Indeed, we know that, for an engine:
>>>>enthalpy = free energy + entropy × temperature.
>>>>
>>>>At a fixed temperature T,
>>>>enthalpy = free energy + T × entropy.
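>>>>
>>>>A minimal numeric sketch of that bookkeeping, with hypothetical values 
>>>>chosen only to illustrate the identity H = G + T·S at fixed T:
>>>>
>>>>T = 298.15     # temperature, K
>>>>S = 0.10       # entropy, kJ/(mol*K)  (hypothetical value)
>>>>G = -50.0      # free energy, kJ/mol  (hypothetical value)
>>>>H = G + T * S  # enthalpy, kJ/mol
>>>>print(f"{H:.1f} kJ/mol")  # -20.2 kJ/mol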
>>>>
>>>>The information detected (from an environmental object) by an observer is 
>>>>not the total possible one (the enthalpy encompassed in the object), but 
>>>>just a part, i.e., the part that is not uncertain for him (the free 
>>>>energy). Hence, every observer, depending on his peculiar features, 
>>>>detects a different amount of free energy and does not detect the 
>>>>uncertain part (the entropy).
>>>>
>>>>Arturo Tozzi
>>>>AA Professor Physics, University North Texas
>>>>Pediatrician ASL Na2Nord, Italy
>>>>Comput Intell Lab, University Manitoba
>>>>http://arturotozzi.webnode.it/  
>>>>
>>>>
>>>>>----Original message----
>>>>>From: "Christophe Menant"  <christophe.men...@hotmail.fr>
>>>>>Date: 10/10/2017 11.01
>>>>>To:  "dea...@berkeley.edu" <dea...@berkeley.edu>
>>>>>Cc:  "fis@listas.unizar.es" <fis@listas.unizar.es>
>>>>>Subject: [Fis] TR: Data - Reflection - Information
>>>>>
>>>>>
>>>>>Thanks for these comments, Terry.
>>>>>
>>>>>We should indeed be careful not to focus too much on language, because 
>>>>>'meaning' is not limited to human communication, and also because 
>>>>>starting at the level of basic life allows us to address 'meaning' 
>>>>>without the burden of complex performances like self-consciousness or 
>>>>>free will. (The existing bias toward language may come from analytic 
>>>>>philosophy initially dealing with human performances.)
>>>>>Interestingly, a quite similar comment may apply to continental 
>>>>>philosophy, where the 'aboutness' of a mental state was invented for 
>>>>>human consciousness. And this is of some importance for us because 
>>>>>'intentionality' is close to 'meaning'. Happily enough, 
>>>>>'bio-intentionality' is slowly becoming an acceptable entity 
>>>>>( https://philpapers.org/rec/MENBAM-2 ).
>>>>>Regarding Peirce, I'm a bit careful about using the triadic approach in 
>>>>>FIS, because non-human life was not a key subject for him and also 
>>>>>because the Interpreter which creates the meaning of the sign (the 
>>>>>Interpretant) does not seem to be much explicated or detailed.
>>>>>The divisions you propose look interesting (intrinsic, referential, 
>>>>>normative). Would it be possible to read more on that (sorry if I have 
>>>>>missed some of your posts)?
>>>>>Best 
>>>>>Christophe
>>>>>
>>>>>----------------------------------------------------------------------
>>>>>From: Fis <fis-boun...@listas.unizar.es> on behalf of Terrence W. DEACON 
>>>>><dea...@berkeley.edu>
>>>>>Sent: Monday, 9 October 2017 02:30
>>>>>To: Sungchul Ji
>>>>>Cc: foundationofinformationscience
>>>>>Subject: Re: [Fis] Data - Reflection - Information
>>>>> 
>>>>>Against "meaning"
>>>>>
>>>>>I think that there is a danger of allowing our anthropocentrism to bias 
>>>>>the discussion. I worry that the term 'meaning' carries too much of a 
>>>>>linguistic bias. By this I mean that it is too attractive to use 
>>>>>language as our archetypal model when we talk about information. 
>>>>>Language is rather the special case, the most unusual communicative 
>>>>>adaptation ever to have evolved, and one that grows out of and depends 
>>>>>on informational/semiotic capacities shared with other species and with 
>>>>>biology in general.
>>>>>So I am happy to see efforts to bring in topics like music or natural 
>>>>>signs like thunderstorms and would also want to cast the net well beyond 
>>>>>humans to include animal calls, scent trails, and molecular signaling by 
>>>>>hormones. And it is why I am more attracted
>>>>> to Peirce and worried about the use of Saussurean concepts.
>>>>>Words and sentences can indeed provide meanings (as in Frege's Sinn - 
>>>>>"sense" - "intension") and may also provide reference (Frege's Bedeutung 
>>>>>- "reference" - "extension"), but I think that it is important to 
>>>>>recognize that not all signs fit this model.
>>>>>
>>>>>Moreover, a sneeze is often interpreted as evidence about someone's 
>>>>>state of health, and a clap of thunder may indicate an approaching storm.
>>>>>These can also be interpreted differently by my dog, but it is still 
>>>>>information about something, even though I would not say that they mean 
>>>>>something to that interpreter. Both of these phenomena can be said to 
>>>>>provide reference to something other than
>>>>> that sound itself, but when we use such phrases as "it means you have a 
>>>>> cold" or "that means that a storm is approaching" we are using the term 
>>>>> "means" somewhat metaphorically (most often in place of the more accurate 
>>>>> term "indicates").
>>>>>
>>>>>And it is even more of a stretch to use this term with respect to pictures 
>>>>>or diagrams. 
>>>>>So no one would say that a specific feature like the ears in a 
>>>>>caricatured face means something.
>>>>>Though if the drawing is employed in a political cartoon, e.g. with 
>>>>>exaggerated ears, and the whole cartoon is assigned a meaning, then 
>>>>>perhaps the exaggeration of this feature may become meaningful. And yet 
>>>>>we would probably agree that every line of the drawing provides 
>>>>>information contributing to that meaning.
>>>>>
>>>>>So basically, I am advocating an effort to broaden our discussions and 
>>>>>recognize that the term information applies in diverse ways to many 
>>>>>different contexts. And because of this it is important to indicate the 
>>>>>framing, whether physical, formal, biological,
>>>>> phenomenological, linguistic, etc.
>>>>>For this reason, as I have suggested before, I would love to have a 
>>>>>conversation in which we try to agree about which different uses of the 
>>>>>information concept are appropriate for which contexts. The classic 
>>>>>syntax-semantics-pragmatics distinction introduced
>>>>> by Charles Morris has often been cited in this respect, though it too is 
>>>>> in my opinion too limited to the linguistic paradigm, and may be 
>>>>> misleading when applied more broadly. I have suggested a parallel, less 
>>>>> linguistic (and nested in Stan's subsumption sense)
>>>>> way of making the division: i.e. into intrinsic, referential, and 
>>>>> normative analyses/properties of information. 
>>>>>
>>>>>Thus you can analyze intrinsic properties of an informing medium [e.g. 
>>>>>Shannon, etc.] irrespective of these other properties, but can't make 
>>>>>sense of referential properties [e.g. what something is about, conveys] 
>>>>>without considering intrinsic sign-vehicle properties, and can't deal 
>>>>>with normative properties [e.g. use value, contribution to function, 
>>>>>significance, accuracy, truth] without also considering referential 
>>>>>properties [e.g. what it is about].
>>>>>
>>>>>In this respect, I am also in agreement with those who have pointed out 
>>>>>that whenever we consider referential and normative properties we must 
>>>>>also recognize that these are not intrinsic and are 
>>>>>interpretation-relative. Nevertheless, these are legitimate
>>>>> and not merely subjective or nonscientific properties, just not 
>>>>> physically intrinsic. I am sympathetic with those among us who want to 
>>>>> restrict analysis to intrinsic properties alone, and who defend the 
>>>>> unimpeachable value that we have derived from the formal
>>>>> foundations that Shannon's original analysis initiated, but this should 
>>>>> not be used to deny the legitimacy of attempting to develop a more 
>>>>> general theory of information that also attempts to discover formal 
>>>>> principles underlying these higher level properties
>>>>> implicit in the concept. 
>>>>>
>>>>>I take this to be the intent behind Pedro's list. And I think it would 
>>>>>be worth asking for each of his points: Which information paradigm 
>>>>>within this hierarchy does it assume?
>>>>>
>>>>>— Terry
>>>>
>>>
>>>-- 
>>>-------------------------------------------------
>>>Pedro C. Marijuán
>>>Grupo de Bioinformación / Bioinformation Group
>>>Instituto Aragonés de Ciencias de la Salud
>>>Centro de Investigación Biomédica de Aragón (CIBA)
>>>Avda. San Juan Bosco, 13, planta 0
>>>50009 Zaragoza, Spain
>>>Tfno. +34 976 71 3526 (& 6818)
>>>pcmarijuan.i...@aragon.es
>>>http://sites.google.com/site/pedrocmarijuan/
>>>------------------------------------------------- 
>>>
>>
>>
>
_______________________________________________
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis
