The mutual information as a measure of statistical dependence
- 22 November 2002
- conference paper
- Published by Institute of Electrical and Electronics Engineers (IEEE)
Abstract
The mutual information I, if appropriately normalised, can serve as a measure of correlation. In encompassing nonlinear dependences, it generalises the classical measures of linear correlation. An efficient nonparametric estimator of I can be derived from Dobrushin's (1963) information theorem.
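To illustrate the idea of normalised mutual information as a correlation measure, the following is a minimal sketch using a simple histogram (plug-in) estimate. This is not the paper's Dobrushin-based estimator, and the normalisation by sqrt(H(X)·H(Y)) is one common convention assumed here for illustration.

```python
import numpy as np

def normalized_mutual_information(x, y, bins=16):
    """Histogram (plug-in) estimate of I(X;Y), scaled to roughly [0, 1].

    Normalising by sqrt(H(X) * H(Y)) is an assumed convention;
    other choices (min or mean of the entropies) also appear in practice.
    """
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()           # joint probabilities
    px = pxy.sum(axis=1)                # marginal of X
    py = pxy.sum(axis=0)                # marginal of Y
    nz = pxy > 0                        # avoid log(0) on empty cells
    mi = np.sum(pxy[nz] * np.log(pxy[nz] / (px[:, None] * py[None, :])[nz]))
    hx = -np.sum(px[px > 0] * np.log(px[px > 0]))
    hy = -np.sum(py[py > 0] * np.log(py[py > 0]))
    return mi / np.sqrt(hx * hy)

rng = np.random.default_rng(0)
x = rng.normal(size=5000)
print(normalized_mutual_information(x, x))                      # perfect dependence: close to 1
print(normalized_mutual_information(x, rng.normal(size=5000)))  # independent: near 0 (small positive bias)
```

Unlike the linear correlation coefficient, this quantity also picks up purely nonlinear dependences (e.g. y = x**2 yields a value well above zero even though the linear correlation vanishes), which is the generalisation the abstract refers to.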