On 1/27/2012 11:47 PM, Evgenii Rudnyi wrote:
On 27.01.2012 23:03 meekerdb said the following:
On 1/27/2012 12:43 PM, Evgenii Rudnyi wrote:
On 27.01.2012 21:22 meekerdb said the following:
On 1/27/2012 11:21 AM, Evgenii Rudnyi wrote:
On 25.01.2012 21:25 meekerdb said the following:
On 1/25/2012 11:47 AM, Evgenii Rudnyi wrote:
...
Let me suggest a very simple case to better understand what you
are saying. Let us consider the string "10" for simplicity, in
the following cases. First I will cite the thermodynamic
properties of Ag and Al from the CODATA tables (we will need
them):

S° (298.15 K), J K^-1 mol^-1:

Ag (cr): 42.55 ± 0.20
Al (cr): 28.30 ± 0.10

In J K^-1 cm^-3 this becomes

Ag (cr): 42.55/107.87 * 10.49 = 4.14
Al (cr): 28.30/26.98 * 2.70 = 2.83

1) An abstract string "10", as with the abstract book above.

2) Let us now make an aluminum plate (a page) with "10"
hammered on it (as on a coin), with a total volume of 10 cm^3.
The thermodynamic entropy is then 28.3 J/K.

3) Let us now make a silver plate (a page) with "10"
hammered on it (as on a coin), with a total volume of 10 cm^3.
The thermodynamic entropy is then 41.4 J/K.

4) We can easily make another aluminum plate, scaling all
dimensions from 2) to a total volume of 100 cm^3. Then the
thermodynamic entropy is 283 J/K.
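
A minimal Python sketch reproducing these numbers (it assumes
only the CODATA molar entropies quoted above and the molar
masses and densities used in the conversion):

    # volumetric entropy s = S_molar / M * rho; plate entropy = s * V
    materials = {
        # name: (S_molar [J/(K mol)], M [g/mol], rho [g/cm^3])
        "Ag": (42.55, 107.87, 10.49),
        "Al": (28.30, 26.98, 2.70),
    }
    for name, (S_molar, M, rho) in materials.items():
        s_vol = S_molar / M * rho              # J/(K cm^3)
        print(name, round(s_vol, 2), "J/(K cm^3)")

    # cases 2-4: entropy scales linearly with plate volume
    print("Al, 10 cm^3 :", round(2.83 * 10, 1), "J/K")    # case 2
    print("Ag, 10 cm^3 :", round(4.14 * 10, 1), "J/K")    # case 3
    print("Al, 100 cm^3:", round(2.83 * 100, 1), "J/K")   # case 4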

Now we have four different combinations representing the string
"10", and the thermodynamic entropy differs between them. If we
take the statement literally, then the information must be
different in all four cases, and uniquely defined, since the
thermodynamic entropy is already there. Yet in my view this
makes little sense.

Could you please comment on these four cases?

The thermodynamic entropy is a measure of the information
required to locate the possible states of the plates in the
phase space of atomic configurations constituting them. Note
that the thermodynamic entropy you quote is really the
*change* in entropy per degree at the given temperature. It's
a measure of how much more phase space becomes available to
the atomic states when the internal energy is increased. More
available phase space means more uncertainty of the exact
actual state and hence more information entropy. This
information is enormous compared to the "10" stamped on the
plate, the shape of the plate, or any other aspect that we
would normally use to convey information. Only if we cooled
the plate to near absolute zero and then tried to encode
information in its microscopic vibrational states would the
thermodynamic and the encoded information entropies become
similar.
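
To put a number on "enormous": an entropy S corresponds to
S / (k_B ln 2) bits of missing microstate information. A rough
sketch of that conversion, using the 10 cm^3 aluminum plate from
the example above:

    import math

    k_B = 1.380649e-23             # Boltzmann constant, J/K
    S = 28.3                       # J/K, the 10 cm^3 aluminum plate
    bits = S / (k_B * math.log(2))
    print(f"{bits:.1e} bits")      # ~3.0e24 bits, versus 2 bits for "10"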


I would say that from your answer it follows that engineering
information has nothing to do with the thermodynamic entropy.
Don't you agree?

Obviously not, since I wrote above that the thermodynamic entropy
is a measure of how much information it would take to locate the
exact state within the phase space allowed by the thermodynamic
parameters.

Is this what engineers use when they develop communication
devices?



It would certainly be interesting to consider what happens when
we decrease the temperature (in the limit, to zero kelvin).
According to the Third Law, the entropy will then be zero. What
do you think: can we store less information on a copper plate at
low temperatures compared with higher temperatures? Or more?

Are you being deliberately obtuse? Information encoded in the
shape of the plate is not accounted for in the thermodynamic
tables - they are just based on ideal bulk material (ignoring
boundaries).

I am just trying to understand the meaning of the term
"information" as you use it. I would say that there is the
thermodynamic entropy and then the Shannon information entropy.
Shannon developed his theory to help engineers deal with
communication (I believe you recently made a similar statement).
Yet, in my view, when we talk about communication devices and
mechatronics, the information that engineers are interested in
has nothing to do with the thermodynamic entropy. Do you agree or
disagree with that? If you disagree, could you please give an
example from engineering where engineers employ the thermodynamic
entropy as an estimate of information?

I already said I disagreed. You are confusing two different things.
Because structural engineers don't employ the theory of interatomic
forces, it doesn't follow that interatomic forces have nothing to do
with structural properties.

Brent

You disagree that engineers do not use thermodynamic entropy ...


Yes. I disagreed that information "has nothing to do with thermodynamic entropy", as you wrote above. You keep switching formulations. You write X and ask if I agree. I disagree. Then you claim I've disagreed with Y. Please pay attention to your own writing. There's a difference between "X is used in place of Y" and "X has nothing to do with Y".

... but you have not yet shown how information in engineering is related to the thermodynamic entropy. From the Millipede example

http://en.wikipedia.org/wiki/Millipede_memory

"The earliest generation millipede devices used probes 10 nanometers in diameter and 70 nanometers in length, producing pits about 40 nm in diameter on fields 92 µm x 92 µm. Arranged in a 32 x 32 grid, the resulting 3 mm x 3 mm chip stores 500 megabits of data or 62.5 MB, resulting in an areal density, the number of bits per square inch, on the order of 200 Gbit/in²."

It would be much easier to understand you if you said what thermodynamic entropy the value of 62.5 MB in Millipede corresponds to.


The Shannon information capacity is 5e8 bits. The thermodynamic entropy depends on the energy used to switch a memory element. I'd guess it must correspond to at least a few tens of thousands of electrons at 9 V, so

            S ~ [5e8 * 9e4 eV] / [8.6e-5 eV/K * 300 K] ~ 1.7e15

So the total entropy is about 1.7e15 + 5e8 (in units of k), and the information portion is numerically (but not functionally) negligible compared to the thermodynamic part.
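
As a quick check on this order-of-magnitude estimate (the switching
energy of ~1e4 electrons at 9 V per bit is the guess above, not a
measured figure):

    n_bits = 5e8                      # Shannon capacity, bits
    E_bit = 1e4 * 9                   # guessed switching energy per bit, eV
    kT = 8.6e-5 * 300                 # k (eV/K) times 300 K, in eV
    S_thermo = n_bits * E_bit / kT    # entropy in units of k
    print(f"S ~ {S_thermo:.1e} k, vs {n_bits:.0e} bits of Shannon capacity")
    # prints S ~ 1.7e+15, dwarfing the 5e8-bit information content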

Brent


The only example of Thermodynamic Entropy == Information that you have given so far was the work on black holes. However, as far as I know, there is no theory yet to describe a black hole: on one side you need gravitation, on the other quantum effects, and a theory that unites them does not seem to exist.

Evgenii


My example would be Millipede

http://en.wikipedia.org/wiki/Millipede_memory

I am pretty sure that when the IBM engineers developed it, they
did not employ the thermodynamic entropy to estimate its
information capacity. Also, an increase in temperature would
destroy the information saved there.

Well, I might indeed be deliberately obtuse, yet with the sole
goal of reaching a clear definition of what information is. Right
now I would say that there is information in engineering and
information in physics, and they are different. The first I
roughly understand; the second I do not.

Evgenii



Brent




