Maximum-entropy distributions having prescribed first and second moments (Corresp.)
- 1 September 1973
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Information Theory
- Vol. 19 (5), 689-693
- https://doi.org/10.1109/tit.1973.1055060
Abstract
The entropy $H$ of an absolutely continuous distribution with probability density function $p(x)$ is defined as $H = -\int p(x) \log p(x)\, dx$. The formal maximization of $H$, subject to the moment constraints $\int x^r p(x)\, dx = \mu_r$, $r = 0, 1, \cdots, m$, leads to $p(x) = \exp(-\sum_{r=0}^{m} \lambda_r x^r)$, where the $\lambda_r$ have to be chosen so as to satisfy the moment constraints. Only the case $m = 2$ is considered. It is shown that when $x$ has finite range, a distribution maximizing the entropy exists and is unique. When the range is $[0, \infty)$, the maximum-entropy distribution exists if, and only if, $\mu_2 \leq 2\mu_1^2$, and a table is given which enables the maximum-entropy distribution to be computed. The case $\mu_2 > 2\mu_1^2$ is discussed in some detail.
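As a rough illustration of the $[0, \infty)$ case, the sketch below numerically fits $\lambda_0, \lambda_1, \lambda_2$ in $p(x) = \exp(-\lambda_0 - \lambda_1 x - \lambda_2 x^2)$ to prescribed moments $\mu_1, \mu_2$ by moment matching. This is not the paper's method (which provides a table); it is a minimal sketch assuming SciPy, with `fit_maxent` and `moments` as hypothetical names. Only the existence condition $\mu_2 \leq 2\mu_1^2$ is taken from the result quoted in the abstract.

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import least_squares

def moments(l1, l2):
    """Normalised first and second moments of exp(-l1*x - l2*x^2) on [0, inf)."""
    f = lambda x, r: x**r * np.exp(-l1 * x - l2 * x**2)
    z, m1, m2 = (quad(f, 0, np.inf, args=(r,))[0] for r in (0, 1, 2))
    return m1 / z, m2 / z

def fit_maxent(mu1, mu2):
    # Existence condition from the paper: on [0, inf) a maximum-entropy
    # density with these moments exists iff mu2 <= 2*mu1^2.
    if mu2 > 2 * mu1**2:
        raise ValueError("no maximum-entropy density exists: mu2 > 2*mu1^2")
    def residual(p):
        m1, m2 = moments(p[0], p[1])
        return [m1 - mu1, m2 - mu2]
    # Start from the exponential boundary case (l1 = 1/mu1, l2 = 0);
    # the bound l2 >= 0 keeps the density normalisable on [0, inf).
    sol = least_squares(residual, x0=[1.0 / mu1, 0.0],
                        bounds=([-np.inf, 0.0], [np.inf, np.inf]))
    l1, l2 = sol.x
    z = quad(lambda x: np.exp(-l1 * x - l2 * x**2), 0, np.inf)[0]
    return np.log(z), l1, l2  # l0 = log z makes p integrate to 1

l0, l1, l2 = fit_maxent(mu1=1.0, mu2=1.5)  # mu2 <= 2*mu1^2, so a solution exists
print(l0, l1, l2)
```

In the boundary case $\mu_2 = 2\mu_1^2$ the solver should return $\lambda_2 = 0$ and $\lambda_1 = 1/\mu_1$, i.e., the exponential density, whose second moment is exactly $2\mu_1^2$.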