Re: [Fis] Are there 3 kinds of motions in physics and biology?

2018-05-07 Thread Michel Petitjean
Dear Karl,
Yes, I can hear you.
About symmetry, I shall soon send you an explanatory email, privately,
because I do not want to bother the FISers with long explanations
(unless I am asked to).
However, I confess that many posts I receive from the FIS list are very
hard to read, and often I do not understand their deep content :)
In fact, that should not be shocking: few people are able to read texts
from very diverse fields (as occurs in the FIS forum), and I am not one
of them.
Even Sung's post was unclear to me, which is exactly why I asked him
questions, but only on the points that I may have a chance to understand
(maybe).
Best regards,
Michel.

___
Fis mailing list
Fis@listas.unizar.es
http://listas.unizar.es/cgi-bin/mailman/listinfo/fis


Re: [Fis] Are there 3 kinds of motions in physics and biology?

2018-05-07 Thread Karl Javorszky
Dear Michel and Sung,



Your discussion is way above my head in jargon and background knowledge.
Please bear with me while a non-mathematician tries to express some
observations concerning symmetry.



Two almost symmetrical spaces appear as Gestalts, expressed by numbers, if
one orders and reorders the expression a+b=c. One uses natural numbers –
in the range 1..16 – to create a demo collection, which one then sorts
and re-sorts ad libitum / ad nauseam. Setting up the whole exercise takes
no more than one, at most two, hours. Then one can observe patterns.



The patterns here specifically referred to are two – almost – symmetrical
rectangular, orthogonal spaces. As these patterns are derived from simple
sorting operations on natural numbers, one can well argue that they
represent fundamental pictures.



The generating algorithm is 5 lines of code. Here it is:

    d = 16
    begin outer loop, i: 1..d
        begin inner loop, j: i..d
            append new record
            write a=i, b=j, c=a+b, k=b-2a, u=b-a, t=2b-3a,
                  q=a-2b, s=(d+1)-(a+b), w=2a-3b
        end inner loop
    end outer loop
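For convenience, here is the algorithm above as a runnable Python sketch; the column letters and d = 16 come from the text, while the dict-of-records layout is my own choice:

```python
# Runnable version of the generator: all pairs 1 <= a <= b <= 16 and the
# nine derived aspects of a+b=c.
d = 16
rows = []
for i in range(1, d + 1):      # begin outer loop, i: 1..d
    for j in range(i, d + 1):  # begin inner loop, j: i..d
        a, b = i, j            # append new record and write its fields
        rows.append({"a": a, "b": b, "c": a + b,
                     "k": b - 2 * a, "u": b - a, "t": 2 * b - 3 * a,
                     "q": a - 2 * b, "s": (d + 1) - (a + b),
                     "w": 2 * a - 3 * b})
print(len(rows))  # 136 records: 16 * 17 / 2 pairs with a <= b
```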

The next step is to sequence (sort, order) the rows. We use two sorting
criteria: as the first, any one of {a,b,c,k,u,t,q,s,w}, and as the second
sorting criterion, any of the remaining 8. This makes each of the 9 aspects
of a+b=c once a first and once a second sorting key. We register the linear
sequential number of each element in a column for each of the 72 catalogued
sorting orders.

Do you think the idea of symmetry is somehow connected to some very basic
truths of logic? Then perhaps the small effort of creating a database with
136 rows and 9+72 columns is worthwhile.
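The cataloguing step can be sketched in Python as follows; the generator is repeated so the snippet is self-contained, and how ties beyond the two keys are broken is my assumption (left to Python's stable sort):

```python
from itertools import permutations

# The 136 records of the demo collection, rebuilt so this sketch is
# self-contained.
d = 16
rows = [dict(a=a, b=b, c=a + b, k=b - 2 * a, u=b - a, t=2 * b - 3 * a,
             q=a - 2 * b, s=(d + 1) - (a + b), w=2 * a - 3 * b)
        for a in range(1, d + 1) for b in range(a, d + 1)]

keys = ["a", "b", "c", "k", "u", "t", "q", "s", "w"]
# For each ordered key pair (alpha, beta): the linear place (1..136) of
# every row when sorted by alpha first and beta second.
position = {}
for alpha, beta in permutations(keys, 2):  # 9 * 8 = 72 sorting orders
    order = sorted(range(len(rows)),
                   key=lambda r: (rows[r][alpha], rows[r][beta]))
    position[(alpha, beta)] = {row: place
                               for place, row in enumerate(order, start=1)}

print(len(position))  # 72 catalogued sorting orders, one column each
```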



The trick begins with the next step:

We go through the 72 sorting orders and re-sort from each of them into each
of the remaining 71. We register the sequential place of an element in the
order αβ while it is being resorted into order γδ. This gives each element a
value (a linear place, 1..136) “from” and a value “to”. The element is given
the attributes: Element: a,b; “Old Order”: αβ, from place nr. i; “New
Order”: γδ, to place nr. j. While doing this, one will realise that
reorganisations happen by means of cycles, and will add the attributes:
Cycle nr.: k; Within-cycle step nr.: l. This is simple counting and the use
of logical flags.
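The bookkeeping described here is ordinary permutation-cycle decomposition. A Python sketch under that reading (the function name and the toy data are mine):

```python
def cycles_of_reorder(from_places, to_places):
    """Decompose a reorganisation into cycles.

    from_places / to_places map row id -> linear place (1..n): the place
    of each element in the old order and in the new order.
    """
    # Permutation on places: the element at old place p moves to new place q.
    perm = {}
    for row, p in from_places.items():
        perm[p] = to_places[row]
    seen, cycles = set(), []
    for start in perm:
        if start in seen:
            continue
        cycle, p = [], start
        while p not in seen:
            seen.add(p)
            cycle.append(p)
            p = perm[p]
        cycles.append(cycle)  # fixed points appear as 1-cycles
    return cycles

# Toy usage: four rows whose places swap pairwise -> two 2-cycles.
old = {0: 1, 1: 2, 2: 3, 3: 4}
new = {0: 2, 1: 1, 2: 4, 3: 3}
print([len(c) for c in cycles_of_reorder(old, new)])  # [2, 2]
```

Filtering the reorganisations that consist of 46 cycles then amounts to checking `len(cycles) == 46` on each of the 72 × 71 reorderings.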



The cycles we have now arrived at give a very useful skeleton for any and
all theories about order. You will find the two Euclid-type spaces by
filtering out those reorganisations that consist of 46 cycles, of which 45
have 3 elements in their corpus, where each of the 45 cycles has Σa=18,
Σb=33.




The two rectangular spaces – created by the paths of elements during
resorting – are not quite symmetrical. As an outsider, I believe there is
something here to awaken the natural curiosity of mathematicians.



Hoping to have caught your interest.



Karl



Re: [Fis] Are there 3 kinds of motions in physics and biology?

2018-05-07 Thread Michel Petitjean
Dear Sung,

The formula of the Planckian information in Table 1 is intriguing.
The argument of the log_2 function was proposed in 1895 by Karl Pearson as
a measure of asymmetry of a distribution (see [1], p. 370).
In general the mean can be smaller than the mode (so the log cannot exist),
but I assume that in your context that cannot happen.
Also, I assume that this context excludes distributions such as a mixture
of two well separated unit variance Gaussian laws, for which the mean is
located at an antimode, and not at a mode.

The skewness, which is also used as an asymmetry coefficient, is the
reduced third order centered moment (may be positive or negative).
The square of this latter quantity was also introduced by Karl Pearson as a
measure of asymmetry of a distribution (see [1], p. 351).

So, all these quantities are used as asymmetry measures.

Two questions arise:
1. Does the Planckian information have some relation to symmetry or
asymmetry? If yes, which one?
That would not be shocking: Shu-Kun Lin (refs [2,3]) discussed relations
between information and symmetry.
2. The asymmetry measures above have a major drawback: a null value can be
observed for some families of asymmetric distributions, and not only for
symmetric distributions.

If you indeed need the log of a nonnegative quantity measuring the
asymmetry of a distribution, one which vanishes if and only if the
distribution is symmetric, you may consider the chiral index \chi
(section 2.9, ref [4]).
The \chi index takes values in [0,1] (in fact, in [0,1/2]) for univariate
probability distributions, and it is null if and only if the distribution
is symmetric.
It has other properties, but those fall outside the scope of this
discussion.
Then, simply replace [ (\mu - mode) / \sigma ] with \chi as the argument
of log_2.
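For concreteness, a small Python sketch of Pearson's quantity (mean − mode)/σ on an invented sample; it only illustrates the sign condition discussed above, not the chiral index itself:

```python
import math
from collections import Counter

def pearson_mode_skewness(xs):
    """Pearson's asymmetry measure (mean - mode)/sigma for a sample."""
    n = len(xs)
    mean = sum(xs) / n
    sigma = math.sqrt(sum((x - mean) ** 2 for x in xs) / n)
    mode = Counter(xs).most_common(1)[0][0]
    return (mean - mode) / sigma

# A right-skewed toy sample: the mode sits at 1, the mean is pulled
# right by the tail, so the measure is positive and log_2 exists.
sample = [1, 1, 1, 2, 2, 3, 4, 8]
s = pearson_mode_skewness(sample)
print(s > 0, math.log2(s))
```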

[1] Pearson, K.
Contributions to the Mathematical Theory of Evolution. II. Skew Variation
in Homogeneous Material.
Phil. Trans. Roy. Soc. London (A), 1895, 186, 343-414.

[2] Lin, S.K.
Correlation of Entropy with Similarity and Symmetry.
J. Chem. Inf. Comput. Sci. 1996, 36, 367--376

[3] Lin, S.K.
The Nature of the Chemical Process. 1. Symmetry Evolution – Revised
Information Theory, Similarity Principle and Ugly Symmetry.
Int. J. Mol. Sci. 2001, 2, 10--39.
(available in open access)

[4] Petitjean, M.
Chirality and Symmetry Measures: A Transdisciplinary Review.
Entropy, 2003, 5[3], 271--312.
(available in open access)

Best regards,

Michel.

Michel Petitjean
MTi, INSERM UMR-S 973, University Paris 7,
CNRS SNC 9079
35 rue Helene Brion, 75205 Paris Cedex 13, France.
Phone: +331 5727 8434; Fax: +331 5727 8372
E-mail: petitjean.chi...@gmail.com (preferred),
michel.petitj...@univ-paris-diderot.fr
http://petitjeanmichel.free.fr/itoweb.petitjean.symmetry.html



[Fis] information and energy are separable

2018-05-07 Thread Krassimir Markov
Dear Sung and Francesco,

Information and Energy are not only separable but quite different.

Moreover, Energy exists without Information, while to create and process
Information, Energy is needed.

Please see the following publication, with an example from economics:

ENERGY VERSUS INFORMATION
Krassimir Markov
pages 122-125 in:
http://foibg.com/ibs_isc/ibs-31/IBS_ISC-No31-KDS2014.pdf

Friendly greetings
Krassimir






Re: [Fis] Are there 3 kinds of motions in physics and biology?

2018-05-07 Thread Francesco Rizzo
Dear Sung and dear all,

"I think information and energy are inseparable in reality": this is true
in economics as well.

Part Three – "Teoria del valore: energia e informazione" (Theory of value:
energy and information) – of "Valore e valutazioni. La scienza
dell'economia o l'economia della scienza" (FrancoAngeli, Milano, 1995-1999)
comprises pages 451-646, which deal with this interesting and significant
problem.

Thanks and best wishes.
Francesco

2018-05-07 4:08 GMT+02:00 Sungchul Ji :

> Hi FISers,
>
> I think information and energy are inseparable in reality.  Hence, to
> understand what information is, it may be helpful to understand what energy
> (and the associated concept of motion) is.  In this spirit, I am forwarding
> the email below, which I wrote motivated by the lecture given by Dr.
> Grossberg this afternoon at the 119th Statistical Mechanics Conference.  In
> Table 1 in that email, I divided the particle motions studied in physics
> and biology into three classes -- (i) random, (ii) passive, and (iii)
> active -- and identified the fields of specialization wherein these motions
> are studied as (i) statistical mechanics, (ii) stochastic mechanics, and
> (iii) info-statistical mechanics.  The last term was coined by me in 2012
> in [1].  I will be presenting a short talk (5 minutes) on info-statistical
> mechanics on Wednesday, May 9, at the above meeting.  The abstract of the
> short talk is given below:
>
> Short talk to be presented at the 119th Statistical Mechanics Conference,
> Rutgers University, Piscataway, N.J., May 6-9, 2018.
>
>
>
> Planckian Information may be to Info-Statistical Mechanics what
> Boltzmann Entropy is to Statistical Mechanics.
>
> Sungchul Ji, Department of Pharmacology and Toxicology, Ernest Mario
> School of Pharmacy, Rutgers University, Piscataway, N.J. 08854
>
> Traditionally, the dynamics of any N-particle system in statistical
> mechanics is completely described in terms of the 6N-dimensional phase
> space consisting of the 3N positional coordinates and 3N momenta, where
> N is the number of particles in the system [1]. Unlike the particles dealt
> with in statistical mechanics, which are featureless and shapeless, the
> particles in biology have characteristic shapes and internal structures
> that determine their biological properties.  The particles of physics are
> completely described in terms of energy and matter in the phase space, but
> the description of the particles in living systems requires not only the
> energy and matter of the particles but also their genetic information,
> consistent with the information-energy complementarity (or gnergy)
> postulate discussed in [2, Section 2.3.2].  Thus, it seems necessary to
> expand the dimensionality of the traditional phase space to accommodate
> the information dimension, which includes three coordinates encoding the
> amount (in bits), meaning (e.g., recognizability), and value (e.g.,
> practical effects) of information [2, Section 4.3]. Similar views were
> expressed by Bellomo et al. [3] and Mamontov et al. [4].  The expanded
> “phase space” would comprise the 6N-dimensional phase space of traditional
> statistical mechanics plus the 3N-dimensional information space entailed
> by molecular biology.  The new space (to be called the “gnergy space”),
> composed of these two subspaces, would have 9N dimensions, as indicated in
> Eq. (1).  This equation also makes contact with the concepts of synchronic
> and diachronic information discussed in [2, Section 4.5].  It was
> suggested therein that the traditional 6N-dimensional phase space deals
> with synchronic information, and hence was referred to as the Synchronic
> Space, while the 3N-dimensional information space is concerned with the
> consequences of history and evolution encoded in each particle, and thus
> was referred to as the Diachronic Space.  The resulting space was called
> the gnergy space (since it encodes not only energy but also information).
>
>
>
>    Gnergy Space  =  6N-D Phase Space   +  3N-D Information Space      (1)
>                     (Synchronic Space)     (Diachronic Space)
>
>
>
> The study of both energy and information was defined as
> “info-statistical mechanics” in 2012 [2, pp. 102-106, 297-301].  The
> Planckian information of the second kind, I_PS [5], was defined as the
> negative of the binary logarithm of the skewness of the long-tailed
> histogram that fits the Planckian Distribution Equation (PDE) [6].  In
> Table 1, the Planckian information is compared to the Boltzmann entropy in
> the context of the complexity theory of Weaver [8]. The inseparable
> relation between energy and information that underlies “info-statistical
> mechanics” may be expressed by the following aphorism:
>
> “Information without energy is useless; energy without information is
> valueless.”
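Read literally, the definition of I_PS quoted above can be sketched in Python, assuming the ordinary third-moment skewness of a sample; the PDE fit itself is not reproduced here, and the function names are mine:

```python
import math

def skewness(xs):
    """Third standardized moment (population form) of a sample."""
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs) / n
    m3 = sum((x - mean) ** 3 for x in xs) / n
    return m3 / m2 ** 1.5

def planckian_information_second_kind(xs):
    """I_PS = -log2(skewness), defined only when the skewness is
    positive (a long right tail), as for PDE-fitted histograms."""
    g = skewness(xs)
    if g <= 0:
        raise ValueError("I_PS is undefined for non-positive skewness")
    return -math.log2(g)

# Toy long-tailed sample, invented purely for illustration.
xs = [1, 1, 2, 2, 2, 3, 3, 4, 5, 9]
print(planckian_information_second_kind(xs))
```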
>
>
>
> Table 1.  A comparison between Planckian Information