Steve D'Aprano wrote:
I don't think that's right. The entropy of a single message is a well-defined
quantity, formally called the self-information.
https://en.wikipedia.org/wiki/Self-information

True, but it still depends on knowing (or assuming) the
probability of getting that particular message out of
the set of all possible messages.
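For concreteness, here's a minimal sketch of that calculation (my own
illustration, not anything from the original posts). The model is an
assumption: the message is one of the 2**8 equally likely 8-bit strings,
i.e. eight independent fair coin flips.

from math import log2

def self_information(message_probability):
    """Self-information -log2(p) of a message, in bits."""
    return -log2(message_probability)

bits = "00010101"
p = 0.5 ** len(bits)        # assumed model: independent fair coin flips
print(self_information(p))  # 8.0 -- every 8-bit string scores the same

Note that under this model the particular bits don't matter at all;
only the assumed probability does.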

This is *not* what danceswithnumbers did when he
calculated the "entropy" of his example bit sequences.
He didn't define the set they were drawn from or
what their probabilities were.
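Which matters, because the same bit string yields a different
"self-information" under a different assumed distribution. A quick
sketch, with both Bernoulli parameters being arbitrary choices of mine:

from math import log2

def bernoulli_self_info(s, p_one):
    """Self-information of bit string s under an i.i.d. Bernoulli(p_one) model."""
    p = 1.0
    for b in s:
        p *= p_one if b == "1" else 1.0 - p_one
    return -log2(p)

bits = "00010101"
print(bernoulli_self_info(bits, 0.5))    # 8.0 bits under a fair-coin model
print(bernoulli_self_info(bits, 0.375))  # ~7.64 bits if we assume P(1) = 3/8 instead

Until you say which model the sequences were drawn from, there's no
single number to compute.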

--
Greg