Dear FIS colleagues,

 

As usual, I would like to begin with apologies. I apologize that, because of
the gaps in my education, I can only partially understand what is being said
in most of your mails. Therefore, I will respond only to those segments of
your posts that seem to me to be within the limits of my understanding.

 

To Karl Javorszky: 

On March 24 you wrote: "I have given in my work "Natural orders - de
ordinibus naturalibus" (ISBN 9783990571378) the following definition of the
term "information": Information is a description of what is not the case".

I do not know "what is not the case", but I salute and welcome your
statement that "Information is a description." I have been using a similar
definition for quite a long time now: "Information is a description of
structures observable in a given data set".

By saying this, I do not claim priority or credit - all credit must go to
A. Kolmogorov, who in his 1965 paper "Three approaches to the quantitative
definition of information" was the first to introduce the concept.

Like all other researchers of his time, Kolmogorov developed his measure of
information quantity for a linear, one-dimensional communication-message
data set. I have expanded and extended Kolmogorov's definition to
two-dimensional data sets. In a two-dimensional data set, two types of
structures can be distinguished: primary (basic) data structures and
secondary (meaningful structures of structures) data arrangements. According
to the offered definition, the descriptions of the discerned structures
should be called Physical and Semantic Information, respectively. Further
details on the subject can be found in my publications on ResearchGate
(https://www.researchgate.net/profile/Emanuel_Diamant) or on my site
(http://www.vidia-mant.info).
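
(For reference only, and not as a quotation from my papers: the standard
formulation of Kolmogorov's measure defines the information content of a
data string x as the length of the shortest program p that reproduces x on
a universal machine U,

    K(x) = min { |p| : U(p) = x }.

In these terms, the "description of structures" above plays the role of such
a shortest program, extended from a one-dimensional message string to a
two-dimensional data set.)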

 

To Sungchul Ji (introduced in Pedro C. Marijuan's post from March 23, 2017):


From your presentation "Planckian information: a new measure of order" I was
pleased to learn something new about Planckian information - a newborn kind
of information. Although you are not familiar with the notion of information
as a complex two-part entity (with Physical and Semantic information
subdivisions), you rightly posit Planckian information as an exemplar of
physical information, similar to other representatives of the class such as
Shannon information, Fisher information, Kolmogorov complexity, Chaitin's
algorithmic information, and others.

In your words: "The Planckian information represents the degree of
organization of physical (or nonphysical) systems." and "Planckian
information is primarily concerned with the amount (and hence the
quantitative aspect) of information. There are numerous ways that have been
suggested in the literature for quantifying information besides the
well-known Hartley information, Shannon entropy, algorithmic information,
etc." (That is, Planckian information is one of them, one of the physical
information manifestations: not a foe, not a competitor, not a foreigner or
an outsider.)
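
(Again for reference only, and not quoted from the presentation: the
standard formulas behind the measures listed above are, for a set of N
equally likely alternatives, the Hartley information I = log2 N, and, for a
probability distribution p_1, ..., p_N, the Shannon entropy
H = - sum_i p_i log2 p_i. The Planckian measure is thus offered as one more
member of this family of quantitative, physical-information measures.)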

 

It has to be mentioned that such an approach is not predominant in FIS
discussions. The mainstream way of thinking looks like this: "complaining
about Shannon entropy as a measure of information is completely justified
because it is steam-engine physics unfortunately still widely used despite
its many flaws and limitations"; and further "Shannon entropy should not
even be mentioned any longer in serious discussions about information"
(http://listas.unizar.es/pipermail/fis/2016-June/001039.html). And finally:
"(there is an) urgent need to move away from entropy towards algorithmic
information" (http://sciforum.net/conference/IS4SI-2017/isis-ICPI%202017). 

 

I hope these unfriendly winds will not discourage you. I wish you a speedy
and comfortable accommodation in the FIS community.

 

Best regards,

Emanuel Diamant.

 
