Mutual information (MI) between two variables is a natural
measure of dependency. It is zero if and only if the variables
are independent, captures not only second-order statistical
relations but dependencies of any order, and can be estimated
either from histograms or by first estimating the densities of
the variables (using something like Parzen windows, for example).
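As a minimal sketch of the histogram route mentioned above (the function name and bin count are my own choices, not from the original post), one can bin the samples jointly, normalize to get probability estimates, and sum p(x,y) * log( p(x,y) / (p(x) p(y)) ):

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Estimate MI (in nats) between samples x and y via a 2D histogram."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()               # joint probability estimate
    px = pxy.sum(axis=1, keepdims=True)     # marginal of x (column vector)
    py = pxy.sum(axis=0, keepdims=True)     # marginal of y (row vector)
    mask = pxy > 0                          # skip empty bins to avoid log(0)
    return float(np.sum(pxy[mask] * np.log(pxy[mask] / (px @ py)[mask])))

rng = np.random.default_rng(0)
a = rng.normal(size=10_000)
b = rng.normal(size=10_000)                 # independent of a
mi_indep = mutual_information(a, b)         # small (histogram bias only)
mi_dep = mutual_information(a, a + 0.1 * b) # clearly positive
```

Note that the histogram estimator has a positive bias that grows with the number of bins relative to the sample size, so the "independent" estimate will be small but not exactly zero; Parzen-window (kernel) density estimates trade that bias for a bandwidth choice instead.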

Cheers,

- Kari
