On 4/14/2015 9:39 AM, Telmo Menezes wrote:
There is, however, an interesting parallel to be made with Shannon's entropy, which is a measure of information content and not just a statistical effect. Once in the realm of digital physics, it becomes questionable if physical entropy and information entropy are separate things.

Shannon's entropy is statistical. It's the potential reduction in uncertainty that a message can provide. It's the entropy of the channel, calculated from the probabilities of the possible messages.
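
As a rough illustration of that calculation (the four message probabilities below are made-up values, not anything from the thread), the entropy is just the expected surprisal over the possible messages:

    import math

    # Assumed, illustrative probabilities of the possible messages on the channel.
    message_probs = [0.5, 0.25, 0.125, 0.125]

    # Shannon entropy in bits: H = -sum(p * log2(p)).
    entropy_bits = -sum(p * math.log2(p) for p in message_probs)

    print(entropy_bits)  # 1.75 bits of uncertainty a received message can remove

The number depends only on the probability distribution over messages, not on what any particular message "means", which is the sense in which it's a property of the channel.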

Brent
