Re: [Fis] A PROPOSAL ABOUT THE DEFINITION OF INFORMATION

2017-10-13 Thread tozziarturo
Dear Sung, I'm sorry, but the "Unreasonable Effectiveness of Mathematics" still holds true. Forget philosophical concepts like Yin and Yang, because, in some cases and contexts, entropy is negative. To give just one example, "Since the entropy H(S|O) can now become negative, erasing a syste
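(A minimal numerical sketch, not part of the original message, of the negative conditional entropy being referred to: for a maximally entangled pair of qubits, the joint state SO is pure while the observer's marginal O is maximally mixed, so the quantum conditional entropy H(S|O) = H(SO) - H(O) = 0 - 1 = -1 bit. The Python snippet below, using only numpy, illustrates this; the state and the variable names are illustrative assumptions, not anything from the thread.)

import numpy as np

# Bell state |Phi+> = (|00> + |11>)/sqrt(2) of system S and observer/memory O
phi = np.zeros(4)
phi[0] = phi[3] = 1 / np.sqrt(2)
rho_SO = np.outer(phi, phi)                     # joint density matrix (pure state)

# Partial trace over S: reshape to indices (s, o, s', o') and trace axes 0 and 2
rho_O = np.trace(rho_SO.reshape(2, 2, 2, 2), axis1=0, axis2=2)

def vn_entropy_bits(rho):
    """von Neumann entropy in bits, dropping numerically zero eigenvalues."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-(evals * np.log2(evals)).sum())

H_SO = vn_entropy_bits(rho_SO)   # 0 bits (the joint state is pure)
H_O  = vn_entropy_bits(rho_O)    # 1 bit (the marginal is maximally mixed)
print(H_SO - H_O)                # conditional entropy H(S|O) = -1 bit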

Re: [Fis] A PROPOSAL ABOUT THE DEFINITION OF INFORMATION

2017-10-13 Thread Sungchul Ji
Hi Arturo, (1) I don't understand where you got (or how you can justify) S = 1 J/K in your statement, "With the same probability mass function, you can see that H = S/(ln(2)*kB), so setting S = 1 J/K gives a Shannon entropy of 1.045×10^23 bits." (2) I can see how one can get H = S/(ln(2)*k_B
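(A quick numerical check of the disputed figure, not from the thread itself: plugging S = 1 J/K into H = S/(ln(2)*kB) with the SI value of the Boltzmann constant does reproduce roughly 1.045×10^23 bits. The short Python sketch below assumes nothing beyond that formula.)

import math

k_B = 1.380649e-23          # Boltzmann constant, J/K (exact SI value)
S = 1.0                     # hypothetical thermodynamic entropy, J/K
H_bits = S / (math.log(2) * k_B)
print(f"H = {H_bits:.3e} bits")   # ~1.045e+23 bits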

Re: [Fis] Data - Reflection - Information

2017-10-13 Thread Robert E. Ulanowicz
Dear Mark, Thank you for your interest in my FIS paper! I didn't intend by it to imply that Shannon-class measures were the ultimate tool for information science, only to argue against prematurely rejecting that thrust entirely -- as so many do.

[Fis] R: Re: A PROPOSAL ABOUT THE DEFINITION OF INFORMATION

2017-10-13 Thread tozziart...@libero.it
Dear Sung, One J/K corresponds to 1.045×10^23 bits. Indeed, the Gibbs entropy formula states that the thermodynamic entropy S equals kB * sum_i [ p_i * ln(1/p_i) ], with units of J/K, where kB is the Boltzmann constant and p_i is the probability of microstate i. On the other hand, the Shannon entropy is defined
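(A brief sketch of how the two formulas line up, assuming an arbitrary example distribution of my own choosing: compute the Gibbs entropy S = kB * sum_i p_i ln(1/p_i) and the Shannon entropy H = sum_i p_i log2(1/p_i) on the same probability mass function and check that H = S/(kB ln 2). The distribution p below is illustrative only.)

import math

p = [0.5, 0.25, 0.25]       # example probability mass function (assumption)
k_B = 1.380649e-23          # Boltzmann constant, J/K

S = k_B * sum(pi * math.log(1.0 / pi) for pi in p)    # Gibbs entropy, J/K
H = sum(pi * math.log2(1.0 / pi) for pi in p)         # Shannon entropy, bits

print(S, H, S / (math.log(2) * k_B))   # the last value equals H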