Network Information and Connected Correlations
- 2 December 2003
- research article
- Published by American Physical Society (APS) in Physical Review Letters
- Vol. 91 (23), 238701
- https://doi.org/10.1103/physrevlett.91.238701
Abstract
Entropy and information provide natural measures of correlation among elements in a network. We construct here the information theoretic analog of connected correlation functions: irreducible N-point correlation is measured by a decrease in entropy for the joint distribution of N variables relative to the maximum entropy allowed by all the observed (N−1)-variable distributions. We calculate the “connected information” terms for several examples, and show that it also enables the decomposition of the information that is carried by a population of elements about an outside source.
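A minimal formalization of the definition sketched in the abstract (the notation below is ours, not quoted from the paper): let P(x_1, …, x_N) be the observed joint distribution and let \tilde{P}^{(N-1)} be the maximum entropy distribution consistent with all of the observed (N−1)-variable marginals. The irreducible N-point ("connected") information is then the resulting drop in entropy,

$$
I_C^{(N)} \;=\; S\!\left[\tilde{P}^{(N-1)}\right] \;-\; S\!\left[P\right],
\qquad
S[P] \;=\; -\sum_{x_1,\dots,x_N} P(x_1,\dots,x_N)\,\log P(x_1,\dots,x_N).
$$

In this reading, a strictly positive I_C^{(N)} signals structure in the joint distribution that cannot be accounted for by any combination of the lower-order marginals alone.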