Note: what follows is an abbreviated text taken from the presentation.
The whole file, too big for our list, can be found at the FIS web pages:
http://fis.sciforum.net/wp-content/uploads/sites/2/2014/11/Planckian_information.pdf
A very recent article developing similar ideas: http://www.mdpi.com/2078-2489/8/1/24
Greetings to all--Pedro
-------------------------------------------------------------------------------------------------------------------


*What is the Planckian information?*

*SUNGCHUL JI*

/Department of Pharmacology and Toxicology
Ernest Mario School of Pharmacy
Rutgers University/
/s...@pharmacy.rutgers.edu/

The Planckian information (I_P) is defined as the information produced (or used) by so-called Planckian processes, which are in turn defined as any physicochemical or formal processes that generate long-tailed histograms fitting the Planckian Distribution Equation (PDE),

y = (A/(x + B)^5)/(exp(C/(x + B)) – 1)        (1)

where A, B and C are free parameters, x is the class or bin to which objects or entities belong, and y is the frequency [1, 1a]. The PDE was derived in 2008 [2] from the blackbody radiation equation discovered by M. Planck (1858-1947) in 1900, by replacing the universal constants and temperature with the free parameters A, B and C. The PDE has been found to fit not only blackbody radiation spectra (as it should) but also numerous other long-tailed histograms [3, 4] (see Figure 1).
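As an illustration (a minimal sketch: the function name `pde` and the parameter values A = 1e6, B = 1, C = 10 are my own choices, not taken from the text), Eq. (1) can be evaluated directly to confirm its characteristic shape of a rapid rise followed by a long tail:

```python
import math

def pde(x, A, B, C):
    """Planckian Distribution Equation, Eq. (1):
    y = (A / (x + B)**5) / (exp(C / (x + B)) - 1)."""
    u = x + B
    return (A / u**5) / (math.exp(C / u) - 1.0)

# Illustrative free parameters; in practice A, B and C are fitted to a histogram.
A, B, C = 1.0e6, 1.0, 10.0
ys = [pde(x, A, B, C) for x in range(50)]

peak = ys.index(max(ys))  # the mode of the long-tailed curve
# After the peak the curve decays monotonically -- the long tail:
assert all(ys[i] >= ys[i + 1] for i in range(peak, len(ys) - 1))
```

With these parameters the curve peaks near x = 1 and then decays slowly, which is the long-tailed behavior that distinguishes the PDE from a symmetric Gaussian.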

One possible explanation for the universality of PDE is that many long-tailed histograms are generated by some selection mechanisms acting on randomly/thermally accessible processes [3]. Since random processes obey the Gaussian distribution, the ratio of the area under the curve (AUC) of PDE to that of Gaussian-like symmetric curves can be used as a measure of non-randomness or the order generated by the Planckian processes.

As can be seen in *Figs. 1 (g), (i), (k), (o), (r)* and *(t)*, the curves labeled ‘Gaussian’ or ‘Gaussian-like’ overlap with the rising phase of the PDE curves. The ‘Gaussian-like’ curves were generated by Eq. (2), which was derived from the Gaussian equation by replacing its pre-exponential factor with the free parameter A:

y = A e^(–(x – μ)^2/(2σ^2))        (2)

The degree of mismatch between the area under the curve (AUC) of the PDE, Eq. (1), and that of the GLE, Eq. (2), is postulated to be a measure of /non-randomness/ (and hence /order/). The GLE is associated with random processes, since it is symmetric with respect to the sign reversal of the term (x – μ) in its exponent. This /measure of order/ is referred to as the Planckian Information (I_P), defined quantitatively by Eq. (3) or Eq. (4):

I_P = log_2 (AUC(PDE)/AUC(GLE)) bits        (3)

or


I_P = log_2 [∫P(x)dx/∫G(x)dx] bits        (4)

where P(x) and G(x) are the Planckian Distribution Equation and the Gaussian-Like Equation, respectively.
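To make Eqs. (3) and (4) concrete, here is a minimal numerical sketch (all function names and parameter values are illustrative assumptions, not taken from the text): it approximates the two AUCs by the trapezoidal rule and takes the base-2 logarithm of their ratio, with the GLE matched to the rising phase of the PDE as in Figure 1:

```python
import math

def pde(x, A, B, C):
    """Planckian Distribution Equation, Eq. (1)."""
    u = x + B
    return (A / u**5) / (math.exp(C / u) - 1.0)

def gle(x, A, mu, sigma):
    """Gaussian-Like Equation, Eq. (2): y = A * exp(-(x - mu)**2 / (2*sigma**2))."""
    return A * math.exp(-((x - mu) ** 2) / (2.0 * sigma**2))

def auc(f, lo, hi, n=10_000):
    """Trapezoidal approximation of the area under f on [lo, hi]."""
    h = (hi - lo) / n
    interior = sum(f(lo + i * h) for i in range(1, n))
    return h * (0.5 * (f(lo) + f(hi)) + interior)

# Illustrative parameters: the GLE amplitude, mean, and width are chosen so
# that it overlaps the rising phase of this PDE (peak value ~212 near x = 1).
area_pde = auc(lambda x: pde(x, 1.0e6, 1.0, 10.0), 0.0, 100.0)
area_gle = auc(lambda x: gle(x, 212.0, 1.0, 1.0), 0.0, 100.0)

# Eq. (3): I_P = log2(AUC(PDE) / AUC(GLE)), in bits.
I_P = math.log2(area_pde / area_gle)
```

With these illustrative parameters I_P comes out positive, because the PDE's long tail contributes area that the symmetric GLE lacks; I_P would be zero if the two areas coincided, and negative if AUC(PDE) were the smaller.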

It is generally accepted that there are at least three basic aspects to information – /amount/, /meaning/, and /value/. /Planckian information/ is primarily concerned with the /amount/ (and hence the /quantitative/ aspect) of information. Numerous ways of /quantifying information/ have been suggested in the literature besides the well-known Hartley information, Shannon entropy, algorithmic information, etc. [5]. The Planckian information, given by Eq. (3), is a new measure of information that applies to the /Planckian process/, generally defined as in (5):

“Planckian processes are the physicochemical, neurophysiological,
biomedical, mental, linguistic, socioeconomic, cosmological, or any
other processes that generate long-tailed histograms obeying the
Planckian distribution equation (PDE).”        (5)

The Planckian information represents the degree of organization of physical (or nonphysical) systems, in contrast to the Boltzmann or Boltzmann-Gibbs entropy, which represents the disorder/disorganization of a physical system. I_P is related to the “organized complexity” and S to the “disorganized complexity” of Weaver [6]. The organization represented by I_P results from /symmetry-breaking selection processes/ applied to some randomly accessible (and hence symmetrically distributed) processes, whether the system involved is atoms, enzymes, cells, brains, languages, human societies, or the Universe [3, 4], as schematically depicted in *Figure 2*.

There is great confusion in science and philosophy concerning the relation between the concepts of /information/ and /entropy/, as pointed out by Wicken [7]. A large part of this confusion may be traced back to the suggestions made by Schrödinger in 1944 [8], and by others subsequently (e.g., von Neumann, Brillouin, etc.), that /order/ can be measured as the /inverse of disorder/ (D) and hence that information can be measured as negative entropy (see the second column in *Table 1*).

*Table 1.* Two different views on the entropy-information relation. I_P = the Planckian information, Eq. (3). D = disorder. AUC = Area Under the Curve; PDE = Planckian Distribution Equation, Eq. (1); GLE = Gaussian-like Equation, Eq. (2).

                     *Schrödinger (1944)* [8]       *Ji (2015)* [1, 3]

Entropy (S)          S = k log D                    S = k log D

Information (I)      /- S = k log (1/D)/            /I_P = log_2 [AUC(PDE)/AUC(GLE)]/

As I pointed out in [9], the concept of “negative entropy” violates the /Third Law of Thermodynamics/ and hence cannot be used to define “order” or “information”. However, the Planckian information, I_P, can be positive, zero, or negative, depending on whether AUC(PDE) is greater than, equal to, or less than AUC(GLE), respectively, leading to the conclusion that

“Information can, but entropy cannot, be negative.”        (6)

and hence that

“Information is not entropy.”        (7)

I recommended in [10] that Statement (6) or (7) be referred to as the *First Law of Informatics* (FLI). It is hoped that FLI will help clarify the decades-long confusion plaguing the fields of informatics, computer science, thermodynamics, biology, and philosophy.

Another way of supporting the thesis that /information/ and /entropy/ are not equivalent is to invoke the notion of /irreducible triadic relations/ (ITR) of Peirce (1839-1914) [11], according to whom the sign (i.e., anything that stands for something other than itself) is an irreducible triad of /object/, /representamen/ (also called /sign/) and /interpretant/. The irreducible triadic relation can be represented as the 3-node network shown in *Figure 3*. The /communication system/ of Shannon is also irreducibly triadic, since it can be mapped to the sign triad as indicated in Figure 3. Entropy (in the sense of Shannon’s communication theory) is one of the three /nodes/, and information (in the sense of Peircean semiotics) is one of the three /edges/. Clearly, nodes and edges are two different classes of entities, consistent with FLI, Statement (7).


*Figure 3.* The isomorphism between Shannon’s communication system (/the source-message-receiver triad/) and Peirce’s semiotic system (/the object-sign-interpretant triad/), the “interpretant” being defined as the effect that a sign has on the mind of an interpreter. The arrows read “determines” or “constrains”. /f/ = sign/message production; /g/ = sign/message interpretation; /h/ = information flow, or correspondence. The diagram is postulated to be equivalent to the commutative triangle of category theory [12], i.e., f x g = h.


*References:*
[1] Ji, S. (2015). Planckian Information (I_P): A New Measure of Order in Atoms, Enzymes, Cells, Brains, Human Societies, and the Cosmos. In: /Unified Field Mechanics: Natural Science beyond the Veil of Spacetime/ (Amoroso, R., Rowlands, P., and Kauffman, L., eds.), World Scientific, New Jersey, pp. 579-589. PDF at http://www.conformon.net/wp-content/uploads/2016/09/PDE_Vigier9.pdf

[1a] Ji, S. (2016). Planckian Information (I_P): A Measure of the Order in Complex Systems. In: /Information and Complexity/ (Burgin, M. and Calude, C. S., eds.), World Scientific, New Jersey.

[2] Ji, S. (2012). /Molecular Theory of the Living Cell: Concepts, Molecular Mechanisms and Biomedical Applications/. Springer, New York. Chapters 11 and 12. PDF at http://www.conformon.net under Publications > Book Chapters.

[3] Ji, S. (2015). Planckian distributions in molecular machines, living cells, and brains: The wave-particle duality in biomedical sciences. In: /Proceedings of the International Conference on Biology and Biomedical Engineering/, Vienna, March 15-17, pp. 115-137. Retrievable from http://www.inase.org/library/2015/vienna/BICHE.pdf
............

See original file at:
http://fis.sciforum.net/wp-content/uploads/sites/2/2014/11/Planckian_information.pdf


S. Ji, 03/21/2017

--------------------------------------------------

--
-------------------------------------------------
Pedro C. Marijuán
Grupo de Bioinformación / Bioinformation Group
Instituto Aragonés de Ciencias de la Salud
Centro de Investigación Biomédica de Aragón (CIBA)
Avda. San Juan Bosco, 13, planta 0
50009 Zaragoza, Spain
Tfno. +34 976 71 3526 (& 6818)
pcmarijuan.i...@aragon.es
http://sites.google.com/site/pedrocmarijuan/
-------------------------------------------------

_______________________________________________
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis
