Hi,
I am in the middle of finishing a paper in which I am proposing that what I
call the "Planckian information", I_P, measured in bits, is a new measure
of organization or order in atoms, enzymes, cells, brains, human societies,
and the cosmos. This proposal is based on our recent findings that the
so-called Planckian distribution equation (see Footnote *** in Table 1
below) fits long-tailed histograms generated in atomic physics, molecular
biology, cell biology, brain neuroscience, econophysics, and cosmic
microwave background radiation physics.
*Table 1.* A unified theory of the *amount of information* (UTAI) carried
by a sign:

   I = A log (B/C)

where A = proportionality constant, B = the number of possible messages
available at the message source, and C = the number of messages selected.

Field                      | Symbol | Name                          | A  | B                               | C
---------------------------|--------|-------------------------------|----|---------------------------------|--------------------------------
Statistical mechanics      | S      | entropy (Boltzmann entropy)   | k  | number of possible complexions* | number of selected complexions*
Communication theory       | H      | entropy (Shannon information) | -K | 1                               | P**
Natural and human sciences | I_P    | Planckian information [1]     | 1  | AUC(PDE)***                     | AUC(GLE)***

*Understood here as the number of possible states at the microscopic (or
micro) level of a system.
**The probability of a message being selected.
***AUC = area under the curve of the Planckian distribution equation (PDE),

   y = (a/(Ax + B)^5)/(exp(b/(Ax + B)) - 1),

or of the Gaussian-like equation (GLE),

   y = A exp(-(x - mu)^2/(2*sigma^2)),

where A is a free parameter. I_P
is thought to be a new measure of organization or order.
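For concreteness, here is a minimal numerical sketch of how I_P can be
evaluated from Footnote ***: it integrates the PDE and the GLE and takes
I_P = log2(AUC(PDE)/AUC(GLE)), i.e., the Table 1 row with A = 1 and base-2
logarithms so that I_P comes out in bits. The parameter values and the
integration interval below are arbitrary placeholders, not fitted values.

```python
import numpy as np
from scipy.integrate import quad

# Planckian distribution equation (PDE) from Footnote ***:
#   y = (a/(Ax + B)^5) / (exp(b/(Ax + B)) - 1)
# Here a, b, A, B are the PDE's own fitting parameters, not the UTAI constants.
def pde(x, a, b, A, B):
    u = A * x + B
    return (a / u**5) / (np.exp(b / u) - 1.0)

# Gaussian-like equation (GLE) from Footnote ***:
#   y = A exp(-(x - mu)^2 / (2 sigma^2)), with A a free parameter.
def gle(x, A, mu, sigma):
    return A * np.exp(-((x - mu) ** 2) / (2.0 * sigma**2))

def planckian_information(pde_params, gle_params, lo, hi):
    """I_P = log2(AUC(PDE)/AUC(GLE)) over [lo, hi], in bits."""
    auc_pde, _ = quad(pde, lo, hi, args=pde_params)
    auc_gle, _ = quad(gle, lo, hi, args=gle_params)
    return np.log2(auc_pde / auc_gle)

# Arbitrary illustrative parameters (a, b, A, B) and (A, mu, sigma):
print(planckian_information((1.0, 1.0, 1.0, 0.1), (0.5, 2.0, 1.0), 0.0, 10.0))
```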
If the content of Table 1 is right, we can conclude with reasonable
confidence that
(1) Statistical entropy S and Shannon entropy H can be viewed as
instantiations or TOKENS of the more abstract definition of information
given in the legend to Table 1, called the "Unified Theory of the Amount of
Information" (UTAI), which may itself be viewed as the information TYPE.
(2) Although both H and S share the same name, "entropy", their meanings
are not the same: S in an isolated system increases with time and
temperature, but H does not. In other words, S obeys the Second Law of
Thermodynamics but H does not. This is demonstrated in the thought
experiment called the "Bible test" [2, see Footnote c in Table 4.3].
(3) Information can be thought of as resulting from a selection process
characterized by a ratio, B/C, where B = the number of all possible
choices, and C = the number of choices actually selected (for a worked toy
example, see the first sketch after this list).
(4) Many have suggested that information has three distinct aspects --
quantity, meaning, and value. The UTAI can deal only with the AMOUNT of
information, not with its meaning or its value.
(5) There are many kinds of information, just as there are many kinds of
energy (chemical, electrical, gravitational, kinetic, potential, nuclear,
solar, electromagnetic, etc.). Hence we can speak of Boltzmann's S as
"molecular information", Shannon's H as "probability-dependent information
(?)", and I_P as "Planckian information". The meanings of these kinds of
information would depend critically on the detailed mechanism of selection
operating at the message source level.
(6) More generally, "information" can be defined as the correlation between
the source (or the 'object' in the language of Peircean semiotics) and the
receiver ('interpretant') of a communication system. The message carried
by the messenger ('sign' or 'representamen') in the communication system
can be identified with "information". The net result of such a mediated
process can be described as the 'information flow' from the source to the
receiver.
(7) Just as the Peircean sign is an irreducible triad (i.e., it cannot be
defined without all three of its nodes, object, representamen, and
interpretant, connected by three edges representing, I suggest, 'natural
process', 'mental process', and 'information flow'), so I maintain that
'information' is another "irreducible triad" (of source, messenger, and
receiver); a schematic appears in the second sketch after this list.
(8) The UTAI may be considered the 'quantitative functor' connecting the
mathematical aspects of communication and semiotics.
(9) I predict that there is a 'qualitative functor' (based on the assumed
principle of quantity-quality complementarity) that connects the
qualitative aspects of communication and semiotics, and this qualitative
functor may be identified with natural and formal languages.
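As a concrete illustration of the selection ratio in point (3), consider a
toy case: with A = 1 and base-2 logarithms, selecting C = 1 message out of
B = 8 equally available ones yields I = log2(8/1) = 3 bits. A minimal
sketch (the numbers are illustrative only):

```python
import math

# UTAI from the legend of Table 1: I = A log(B/C).
# Here A = 1 and the logarithm is taken to base 2, so I is in bits.
def utai_bits(B, C, A=1.0):
    """B = messages available at the source; C = messages selected."""
    return A * math.log2(B / C)

print(utai_bits(8, 1))  # selecting 1 message out of 8 -> 3.0 bits
```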
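And, purely as a schematic for point (7): the triad can be modeled as a
structure that cannot be instantiated unless all three nodes are supplied,
mirroring the claimed irreducibility. The assignment of the three edge
labels to particular node pairs below is one possible reading, not
something fixed by the text.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class InformationTriad:
    """Irreducible triad of point (7): construction fails (TypeError)
    unless all three nodes are given."""
    source: str     # Peircean 'object'
    messenger: str  # 'sign' or 'representamen'
    receiver: str   # 'interpretant'

    def edges(self):
        # Edge labels from point (7); the node pairings are an assumption.
        return {
            "natural process": (self.source, self.messenger),
            "mental process": (self.messenger, self.receiver),
            "information flow": (self.source, self.receiver),
        }

t = InformationTriad("object", "representamen", "interpretant")
print(t.edges())
```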
Any questions, comments, or corrections would be appreciated.
All the best.
Sung
References:

[1] Ji, S. (2015). Planckian distributions in molecular machines, living
cells: The wave-particle duality in biomedical sciences. *Proceedings of
the International Conference on Biology and Biomedical Engineering*,
Vienna, March 15-17, 2015, pp. 115-137. Uploaded to ResearchGate in
March 2015.

[2] Ji, S. (2012). The Information-Entropy Relations. In: *Molecular
Theory of the Living Cell: Concepts, Molecular Mechanisms, and Biomedical
Applications*. Springer, New York, pp. 97-101.
Sungchul Ji, Ph.D.
Associate Professor of Pharmacology and Toxicology
Department of Pharmacology and Toxicology
Ernest Mario School of Pharmacy
Rutgers University
Piscataway, N.J. 08855
732-445-4701
www.conformon.net