The only virtue of using the natural base is that you get a nice asymptotic
distribution for random data.
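For the archives, the bits-vs-nats point can be sketched in a few lines. This is a minimal illustration, not Mahout's actual code; the `entropy` helper here is hypothetical:

```python
import math

def entropy(probs, base=math.e):
    # Shannon entropy of a distribution; base e gives nats, base 2 gives bits.
    return -sum(p * math.log(p, base) for p in probs if p > 0)

p = [0.5, 0.25, 0.25]
h_nats = entropy(p)           # natural log -> nats
h_bits = entropy(p, base=2)   # log base 2 -> bits, 1.5 here

# The two differ only by the constant factor ln(2), which is why an
# unnormalized score is unaffected by the choice of base.
assert abs(h_bits * math.log(2) - h_nats) < 1e-12
```

Since the score is only compared against other scores computed the same way, the constant factor washes out, as Sean says below.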
On Fri, Apr 12, 2013 at 1:10 AM, Sean Owen <sro...@gmail.com> wrote:

> Yes, that's true; it is more usually bits. Here it's natural log / nats.
> Since it's unnormalized anyway, another constant factor doesn't hurt, and it
> means not having to change the base.
>
>
> On Fri, Apr 12, 2013 at 8:01 AM, Phoenix Bai <baizh...@gmail.com> wrote:
>
>> I got 168 because I used log base 2 instead of e.
>> If memory serves right, I read in the definition of entropy that people
>> normally use base 2, so I just assumed it was 2 in the code (my bad).
>>
>> And now I have a better understanding, so thank you both for the
>> explanation.
>>
>>
