Minimax Entropy Principle and Its Application to Texture Modeling
- 1 November 1997
- journal article
- Published by MIT Press in Neural Computation
- Vol. 9 (8), 1627-1660
- https://doi.org/10.1162/neco.1997.9.8.1627
Abstract
This article studies exponential families $\mathcal{E}$ on finite sets such that the information divergence $D(P\|\mathcal{E})$ of an arbitrary probability distribution from $\mathcal{E}$ is bounded by some constant $D>0$. A particular class of low-dimensional exponential families that have low values of $D$ can be obtained from partitions of the state space. The main results concern optimality properties of these partition exponential families. The case where $D=\log(2)$ is studied in detail. This case is special, because if $D<\log(2)$, then $\mathcal{E}$ contains all probability measures with full support.
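As a reading aid (this notation is not spelled out in the abstract), the divergence of a distribution $P$ from a family $\mathcal{E}$ is conventionally taken as the infimum over the family's closure:

```latex
D(P\|\mathcal{E}) \;=\; \inf_{Q \in \overline{\mathcal{E}}} D(P\|Q),
\qquad
D(P\|Q) \;=\; \sum_{x} P(x) \log \frac{P(x)}{Q(x)}.
```

Under this reading, a uniform bound $D(P\|\mathcal{E}) \le D$ says that every distribution on the finite state space can be approximated by a member of $\mathcal{E}$ to within $D$ nats.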
This publication has 18 references indexed in Scilit:
- The Helmholtz Machine. Neural Computation, 1995
- What Is the Goal of Sensory Coding? Neural Computation, 1994
- Hierarchical Mixtures of Experts and the EM Algorithm. Neural Computation, 1994
- Entropy-based algorithms for best basis selection. IEEE Transactions on Information Theory, 1992
- Finding Minimum Entropy Codes. Neural Computation, 1989
- Uncertainty relation for resolution in space, spatial frequency, and orientation optimized by two-dimensional visual cortical filters. Journal of the Optical Society of America A, 1985
- Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1984
- Markov Random Field Texture Models. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1983
- Visual Pattern Discrimination. IRE Transactions on Information Theory, 1962
- Information Theory and Statistical Mechanics. Physical Review, 1957