Mutual information (MI) between two variables is a natural measure of dependency. It is zero if and only if the variables are independent, it captures not just second-order statistical relations but dependencies of all orders, and it can be computed either from histograms or by first estimating the densities of the variables (with Parzen windows, for example). Cheers, - Kari
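As a minimal sketch of the histogram approach Kari mentions (using NumPy; the bin count, sample sizes, and toy data are illustrative assumptions, not from the original post): discretize both variables, form the joint and marginal probabilities, and sum p(x,y) * log[ p(x,y) / (p(x) p(y)) ].

    import numpy as np

    def mutual_information(x, y, bins=16):
        """Histogram-based MI estimate, in nats."""
        pxy, _, _ = np.histogram2d(x, y, bins=bins)
        pxy = pxy / pxy.sum()            # joint probabilities p(x, y)
        px = pxy.sum(axis=1)             # marginal p(x)
        py = pxy.sum(axis=0)             # marginal p(y)
        nz = pxy > 0                     # skip empty bins to avoid log(0)
        ratio = pxy[nz] / (px[:, None] * py[None, :])[nz]
        return float(np.sum(pxy[nz] * np.log(ratio)))

    # Dependent pair gives large MI; independent pair gives MI near zero
    # (not exactly zero: histogram estimates have a small positive bias).
    rng = np.random.default_rng(0)
    x = rng.normal(size=10_000)
    print(mutual_information(x, x + 0.1 * rng.normal(size=10_000)))
    print(mutual_information(x, rng.normal(size=10_000)))

The bin count trades off bias and variance: too few bins hide structure, too many leave bins nearly empty, which is why density-based estimators such as Parzen windows are an alternative for small samples.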