On Sun, Dec 21, 2014 at 2:52 PM, Jim Bromer <[email protected]> wrote:
> But again, the 'entropy' is relevant to some 'frames of reference' so
> your last statement must be an over-generalization.

That is a reasonable objection. Entropy depends on the probability
distribution, and probability is just a belief by an observer. In
general, we do not know what the probability is. Even if we assume a
universal distribution, there is still a language dependent constant.

But this does not matter, because the proof that entropy decreases
during computation holds for every probability distribution. Entropy
has a precise definition: H = -SUM_i p_i log p_i, where p_i is the
probability of the i'th possible outcome. For a machine with 2 states
and program {A -> B, B -> B}, the initial entropy is -P(A) log P(A) -
P(B) log P(B). This is a positive number for all 0 < P(A), P(B) < 1.
The next state is always B, so P(A) = 0, P(B) = 1, and the entropy is
0. (We take 0 log 0 = lim{x -> 0+} x log x = 0).
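As a sketch (the initial distribution P(A) = 0.3 is an arbitrary example of mine, not from the argument above), the calculation can be checked numerically:

```python
import math

def entropy(probs):
    """Shannon entropy in bits, taking 0 log 0 = 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Machine with states A and B and transition rule {A -> B, B -> B}.
# Any initial distribution works; take P(A) = 0.3, P(B) = 0.7.
before = [0.3, 0.7]
after = [0.0, 1.0]   # the next state is always B

print(entropy(before))  # positive for any 0 < P(A) < 1
print(entropy(after))   # 0
```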

More generally, whenever two states have a common next state like {A
-> C, B -> C} and both initial states have probabilities greater than
0, the entropy decreases. I think you can see that -P(A) log P(A) -
P(B) log P(B) > -P(C) log P(C), where P(C) = P(A) + P(B), for all
P(A), P(B) > 0 with P(C) <= 1: since P(A) < P(C) and P(B) < P(C), we
have -log P(A) > -log P(C) and -log P(B) > -log P(C), and weighting
by P(A) and P(B) and summing gives the inequality.
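A quick numeric check of this inequality, over a few arbitrary probability values of my own choosing (the third state stands in for the rest of the distribution):

```python
import math

def entropy(probs):
    """Shannon entropy in bits, taking 0 log 0 = 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Merging two states {A -> C, B -> C} always lowers entropy:
# H(P(A), P(B), rest) > H(P(A) + P(B), rest) whenever P(A), P(B) > 0.
for pa in [0.1, 0.25, 0.4]:
    for pb in [0.1, 0.3, 0.5]:
        if pa + pb <= 1:
            rest = 1 - pa - pb
            h_before = entropy([pa, pb, rest])
            h_after = entropy([pa + pb, rest])
            assert h_before > h_after
print("inequality holds for all tested distributions")
```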

This does not conflict with the second law of thermodynamics, the
observation that entropy never decreases in a closed system. In
thermodynamics, entropy is a measure (in bits) of what you don't know
about the state of a system. For example, if you want to describe the
exact state of a box filled with a gas, you would give the position
and velocity of every atom in 3 dimensions. In classical mechanics,
these would be real numbers with infinite precision, so it would only
be possible to talk about relative changes in entropy as temperature,
pressure, volume, or the number of atoms change. For example, if all
the atoms were on one side of the room and separated by a barrier with
a small hole, then we would observe the entropy increase by one bit
per atom as the gas leaked to the other side because you would need
one extra bit per atom to describe its position.
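The one-extra-bit-per-atom bookkeeping can be converted into conventional thermodynamic units; this sketch uses the standard values of Boltzmann's constant and Avogadro's number, with a mole of gas as an assumed example size:

```python
import math

k = 1.380649e-23      # Boltzmann's constant, J/K
N_A = 6.02214076e23   # Avogadro's number

# One extra bit per atom to record which side of the barrier it is on.
# In thermodynamic units, one bit corresponds to k ln 2 J/K of entropy.
bits_per_atom = 1
dS_per_atom = bits_per_atom * k * math.log(2)
dS_per_mole = dS_per_atom * N_A
print(dS_per_mole)  # about 5.76 J/K, i.e. R ln 2 for a mole of gas
```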

However, quantum mechanics limits the precision of any measurement you
make to an integral multiple of Planck's reduced constant, 1.0545717 x
10^-34 Joule-seconds. Therefore, entropy is an absolute quantity. It
is the number of bits that would have to be communicated to you to
describe the state of a system as precisely as quantum mechanics would
allow you to measure it.

Quantum mechanics does not allow a closed system to be observed from
the outside because any observation affects the thing being observed.
Therefore, any observer must be inside the system. A computer with
input and memory is an observer. When you (a computer) observe an
increase in entropy, it means you know less about the system of which
you are a part. For example, you can no longer say that a particular
atom is on one side of the room.

A second observer, observing your own decrease in entropy, would have
to observe an equal or greater increase in entropy elsewhere in the
system, in keeping with the second law of thermodynamics. Irreversible
computations such as writing into memory require free energy. To be
precise, erasing one bit dissipates at least kT ln 2 Joules, where T
is the temperature and k ln 2 = 9.57 x 10^-24 Joules per Kelvin.
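As a rough numeric sketch (the 300 K room temperature is an assumed example value, not from the text above), the kT ln 2 limit works out to:

```python
import math

k = 1.380649e-23  # Boltzmann's constant, J/K

def landauer_limit(temperature_kelvin, bits=1):
    """Minimum free energy (Joules) to irreversibly erase the given bits."""
    return bits * k * temperature_kelvin * math.log(2)

# At an assumed room temperature of T = 300 K:
print(landauer_limit(300))        # about 2.87e-21 J per bit
print(landauer_limit(300, 8e9))   # erasing a gigabyte of memory
```

The limit scales linearly with temperature, which is why, in principle, cold computers can erase bits more cheaply.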

-- 
-- Matt Mahoney, [email protected]

