Mutual information, metric entropy and cumulative relative entropy risk
Open Access
- 1 December 1997
- journal article
- Published by Institute of Mathematical Statistics in The Annals of Statistics
- Vol. 25 (6)
- https://doi.org/10.1214/aos/1030741081
Abstract
No abstract available.

This publication has 39 references indexed in Scilit:
- Probability Inequalities for Likelihood Ratios and Convergence Rates of Sieve MLEs. The Annals of Statistics, 1995
- Rates of convergence for minimum contrast estimators. Probability Theory and Related Fields, 1993
- Density estimation by stochastic complexity. IEEE Transactions on Information Theory, 1992
- On Density Estimation in the View of Kolmogorov's Ideas in Approximation Theory. The Annals of Statistics, 1990
- Information-theoretic asymptotics of Bayes methods. IEEE Transactions on Information Theory, 1990
- Stochastic Complexity and Modeling. The Annals of Statistics, 1986
- On the Consistency of Bayes Estimates. The Annals of Statistics, 1986
- Differential Geometry of Curved Exponential Families - Curvatures and Information Loss. The Annals of Statistics, 1982
- A source matching approach to finding minimax codes. IEEE Transactions on Information Theory, 1980
- Entropies of several sets of real valued functions. Pacific Journal of Mathematics, 1963