On the ε-entropy and the rate-distortion function of certain non-Gaussian processes
- 1 July 1974
- journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Information Theory
- Vol. 20 (4), 517-524
- https://doi.org/10.1109/tit.1974.1055249
Abstract
Let $\xi = \{\xi(t),\ 0 \leq t \leq T\}$ be a process with covariance function $K(s,t)$ and $E \int_0^T \xi^2(t)\,dt < \infty$. It is proved that for every $\varepsilon > 0$ the $\varepsilon$-entropy $H_{\varepsilon}(\xi)$ satisfies \begin{equation} H_{\varepsilon}(\xi_g) - \mathcal{H}_{\xi_g}(\xi) \leq H_{\varepsilon}(\xi) \leq H_{\varepsilon}(\xi_g) \end{equation} where $\xi_g$ is a Gaussian process with the covariance $K(s,t)$ and $\mathcal{H}_{\xi_g}(\xi)$ is the entropy of the measure induced by $\xi$ (in function space) with respect to that induced by $\xi_g$. It is also shown that if $\mathcal{H}_{\xi_g}(\xi) < \infty$ then, as $\varepsilon \rightarrow 0$, \begin{equation} H_{\varepsilon}(\xi) = H_{\varepsilon}(\xi_g) - \mathcal{H}_{\xi_g}(\xi) + o(1). \end{equation} Furthermore, if there exists a Gaussian process $g = \{g(t);\ 0 \leq t \leq T\}$ such that $\mathcal{H}_g(\xi) < \infty$, then the ratio between $H_{\varepsilon}(\xi)$ and $H_{\varepsilon}(g)$ goes to one as $\varepsilon$ goes to zero. Similar results are given for the rate-distortion function, and some particular examples are worked out in detail. Some cases for which $\mathcal{H}_{\xi_g}(\xi) = \infty$ are discussed, and asymptotic bounds on $H_{\varepsilon}(\xi)$, expressed in terms of $H_{\varepsilon}(\xi_g)$, are derived.
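The bounds above reduce the non-Gaussian problem to computing $H_{\varepsilon}(\xi_g)$ for the Gaussian process with the same covariance, which is given by the classical reverse water-filling formula over the Karhunen-Loève eigenvalues of $K(s,t)$. The following is a minimal sketch of that computation, not code from the paper; the function name and the bisection-based search for the water level are illustrative assumptions.

```python
import math

def gaussian_rate_distortion(eigenvalues, distortion):
    """Reverse water-filling for a Gaussian source whose independent
    components have variances `eigenvalues` (the Karhunen-Loeve
    spectrum of the covariance K). Returns the rate in nats that
    achieves total mean-squared distortion `distortion`."""
    lo, hi = 0.0, max(eigenvalues)
    # Bisect on the water level theta: components with variance below
    # theta are discarded entirely, the rest are coded to error theta.
    for _ in range(200):
        theta = 0.5 * (lo + hi)
        d = sum(min(lam, theta) for lam in eigenvalues)
        if d < distortion:
            lo = theta  # water level too low: distortion undershoots
        else:
            hi = theta
    theta = 0.5 * (lo + hi)
    # Rate contributed only by eigenvalues above the water level.
    return sum(0.5 * math.log(lam / theta)
               for lam in eigenvalues if lam > theta)
```

For a single component of variance 1 and distortion 0.25 this gives the familiar $\tfrac{1}{2}\log(\sigma^2/D) = \log 2$ nats; truncating the spectrum at finitely many eigenvalues yields the usual numerical approximation to the process quantity.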