danceswithnumb...@gmail.com wrote:
> 1111000010101011
> This equals 61611.
> This can be represented using symbols 0-6:
> log2(7) * 5 = 14.0367746103 bits.
>
> 1101010100001111
> This equals 54543.
> This can be represented using symbols 0-5:
> log2(6) * 5 = 12.9248125036 bits.

You're missing something fundamental about what
entropy is in information theory.

It's meaningless to talk about the entropy of a single
message. Entropy is a function of the probability
distribution of *all* the messages you might want to
send.

What you calculated in your first example relates to
this situation: You want to send someone messages
consisting of 5 symbols drawn from the set {0, 1,
2, 3, 4, 5, 6}, where all such messages have equal
probability. In that case, you need an average of
about 14.03 bits for each message.
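A quick way to check the figure (a minimal sketch; the 14.03 comes straight from log2 of the alphabet size times the message length):

```python
import math

# Messages are 5 symbols drawn uniformly from a 7-symbol
# alphabet {0, 1, 2, 3, 4, 5, 6}. Under a uniform distribution,
# the entropy per message is length * log2(alphabet size).
alphabet_size = 7
message_length = 5
bits_per_message = message_length * math.log2(alphabet_size)
print(bits_per_message)  # 14.036774610288021
```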

Note that this has essentially nothing to do with
the particular sequence of bits you started with.

Your second calculation was for a similar situation,
except that the set of symbols is just {0, 1, 2,
3, 4, 5}. There are fewer messages of length 5 that
can be constructed from that set, so the number of
bits needed is smaller.
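The same sketch makes the comparison concrete: there are 6**5 possible messages over {0..5} versus 7**5 over {0..6}, so fewer bits are needed on average.

```python
import math

# Count of distinct length-5 messages over each alphabet,
# and the corresponding average bits per message.
print(6 ** 5, 7 ** 5)        # 7776 16807
print(5 * math.log2(6))      # 12.92481250360578
print(5 * math.log2(7))      # 14.036774610288021
```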

> In reality you can express 54543 with 10 bits.

Again, this statement is meaningless. You can't
say *anything* about the number of bits needed to
represent that particular number, without knowing
what *other* numbers you might want to represent.
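To illustrate (a sketch with purely hypothetical message sets, not anything the original poster specified): the bit count for 54543 changes completely depending on which other values are possible.

```python
import math

# The bits needed to identify one value among N equally likely
# possibilities is ceil(log2(N)). The answer for 54543 depends
# entirely on the assumed set of possible values.

# If any 16-bit value might be sent: 16 bits.
print(math.ceil(math.log2(2 ** 16)))  # 16

# If only 1024 particular values might ever be sent
# (a hypothetical set that happens to include 54543): 10 bits.
print(math.ceil(math.log2(1024)))     # 10

# If 54543 is the only message you ever send: 0 bits.
print(math.ceil(math.log2(1)))        # 0
```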

--
Greg
--
https://mail.python.org/mailman/listinfo/python-list
