A general minimax result for relative entropy
- 1 July 1997
- journal article
- Published by the Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Information Theory
- Vol. 43 (4), 1276-1280
- https://doi.org/10.1109/18.605594
Abstract
Suppose nature picks a probability measure $P_\theta$ on a complete separable metric space $S$ at random from a measurable set $\mathcal{P}_\Theta = \{P_\theta : \theta \in \Theta\}$. Then, without knowing $\theta$, a statistician picks a measure $Q$ on $S$. Finally, the statistician suffers a loss $D(P_\theta \| Q)$, the relative entropy between $P_\theta$ and $Q$. We show that the minimax and maximin values of this game are always equal, and that there is always a minimax strategy in the closure of the set of all Bayes strategies. This generalizes previous results of Gallager (1979) and of Davisson and Leon-Garcia (1980).
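In the finite case, where $\Theta$ indexes finitely many distributions on a finite alphabet, the common minimax/maximin value is the capacity of the "channel" $\theta \to S$, and the minimax strategy is the Bayes mixture under a least favorable prior; this is the redundancy-capacity picture of the Gallager and Davisson/Leon-Garcia results cited below. A minimal numerical sketch follows, assuming NumPy; the Blahut-Arimoto iteration is the standard one, but the function names and the toy family `P` are illustrative, not from the paper:

```python
import numpy as np

def kl(p, q):
    """Relative entropy D(p || q) in nats, for finite distributions."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def blahut_arimoto(P, iters=500):
    """Least favorable prior for the family with rows P[theta]:
    the capacity-achieving input distribution of the channel theta -> x."""
    k = P.shape[0]
    pi = np.full(k, 1.0 / k)              # start from the uniform prior
    for _ in range(iters):
        q = pi @ P                        # Bayes strategy: the mixture Q_pi
        d = np.array([kl(P[t], q) for t in range(k)])
        pi = pi * np.exp(d)               # tilt toward high-divergence rows
        pi /= pi.sum()
    return pi

# Hypothetical toy family: three distributions on a 4-letter alphabet.
P = np.array([[0.70, 0.10, 0.10, 0.10],
              [0.10, 0.70, 0.10, 0.10],
              [0.25, 0.25, 0.25, 0.25]])

pi_star = blahut_arimoto(P)
q_star = pi_star @ P                      # candidate minimax strategy
risks = np.array([kl(P[t], q_star) for t in range(len(P))])
print("least favorable prior:", np.round(pi_star, 4))
print("worst-case risk (minimax):", risks.max())
print("Bayes risk under pi* (maximin):", float(np.dot(pi_star, risks)))
```

At convergence the risks $D(P_\theta \| Q_{\pi^*})$ are approximately equal on the support of $\pi^*$, so the printed worst-case and average risks agree: the minimax = maximin identity of the abstract, specialized to this finite setting.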
References
- Elements of Information Theory. Wiley, 2001.
- A strong version of the redundancy-capacity theorem of universal coding. IEEE Transactions on Information Theory, 1995.
- Jeffreys' prior is asymptotically least favorable under entropy risk. Journal of Statistical Planning and Inference, 1994.
- Bounds on the sample complexity of Bayesian learning using information theory and the VC dimension. Machine Learning, 1994.
- Information-theoretic asymptotics of Bayes methods. IEEE Transactions on Information Theory, 1990.
- A bound on the financial value of information. IEEE Transactions on Information Theory, 1988.
- A source matching approach to finding minimax codes. IEEE Transactions on Information Theory, 1980.
- Random coding strategies for minimum entropy. IEEE Transactions on Information Theory, 1975.
- Universal noiseless coding. IEEE Transactions on Information Theory, 1973.
- An Extension of Wald's Theory of Statistical Decision Functions. The Annals of Mathematical Statistics, 1955.