Entropy and Correlation: Some Comments
- 1 May 1987
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Systems, Man, and Cybernetics
- Vol. 17 (3), 517-519
- https://doi.org/10.1109/tsmc.1987.4309069
Abstract
For measuring the degree of association or correlation between two nominal variables, a measure based on informational entropy is presented as being preferable to that proposed recently by Horibe [1]. Asymptotic developments are also presented that may be used for making approximate statistical inferences about the population measure when the sample size is reasonably large. The use of this methodology is illustrated using a numerical example.
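The abstract does not reproduce the formula itself, so the following sketch is only an illustration of a generic entropy-based association coefficient for a two-way contingency table (mutual information normalized by the marginal entropies, sometimes called symmetric uncertainty). It is an assumption for clarity, not necessarily the exact measure the authors recommend over Horibe's.

```python
import numpy as np

def entropy(p):
    """Shannon entropy (in nats) of a probability vector, ignoring zero cells."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def entropy_association(table):
    """Entropy-based association for an r x c table of counts on two nominal variables.

    Returns the mutual information I(X;Y) and a normalized coefficient
    2*I / (H(X) + H(Y)) lying in [0, 1].  This is a generic illustration,
    not the specific measure proposed in the paper.
    """
    table = np.asarray(table, dtype=float)
    p = table / table.sum()      # joint relative frequencies
    px = p.sum(axis=1)           # row marginals
    py = p.sum(axis=0)           # column marginals
    hx, hy, hxy = entropy(px), entropy(py), entropy(p.ravel())
    mi = hx + hy - hxy           # mutual information I(X;Y)
    coef = 2.0 * mi / (hx + hy) if (hx + hy) > 0 else 0.0
    return mi, coef

# Hypothetical 2x3 contingency table of observed counts
counts = [[30, 10, 5],
          [8, 25, 22]]
mi, coef = entropy_association(counts)
print(f"mutual information = {mi:.4f} nats, normalized coefficient = {coef:.4f}")
```

In practice the paper's asymptotic results would be used to attach approximate standard errors to such a sample coefficient for large samples; those variance formulas are not reproduced here.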
References
- Entropy and correlation. IEEE Transactions on Systems, Man, and Cybernetics, 1985.
- The Analysis of Contingency Tables. Springer Nature, 1977.
- The asymptotic standard errors of some estimates of uncertainty in the two-way contingency table. Psychometrika, 1975.
- Measures of Association for Cross Classifications, IV: Simplification of Asymptotic Variances. Journal of the American Statistical Association, 1972.