Abstract
Let $\xi = \{\xi(t),\ 0 \leq t \leq T\}$ be a process with covariance function $K(s,t)$ and $E \int_0^T \xi^2(t)\, dt < \infty$. It is proved that for every $\varepsilon > 0$ the $\varepsilon$-entropy $H_{\varepsilon}(\xi)$ satisfies
\begin{equation}
H_{\varepsilon}(\xi_g) - \mathcal{H}_{\xi_g}(\xi) \leq H_{\varepsilon}(\xi) \leq H_{\varepsilon}(\xi_g),
\end{equation}
where $\xi_g$ is a Gaussian process with the covariance $K(s,t)$ and $\mathcal{H}_{\xi_g}(\xi)$ is the entropy of the measure induced by $\xi$ (in function space) with respect to that induced by $\xi_g$. It is also shown that if $\mathcal{H}_{\xi_g}(\xi) < \infty$, then, as $\varepsilon \rightarrow 0$,
\begin{equation}
H_{\varepsilon}(\xi) = H_{\varepsilon}(\xi_g) - \mathcal{H}_{\xi_g}(\xi) + o(1).
\end{equation}
Furthermore, if there exists a Gaussian process $g = \{g(t);\ 0 \leq t \leq T\}$ such that $\mathcal{H}_g(\xi) < \infty$, then the ratio between $H_{\varepsilon}(\xi)$ and $H_{\varepsilon}(g)$ goes to one as $\varepsilon$ goes to zero. Similar results are given for the rate-distortion function, and some particular examples are worked out in detail. Some cases for which $\mathcal{H}_{\xi_g}(\xi) = \infty$ are discussed, and asymptotic bounds on $H_{\varepsilon}(\xi)$, expressed in terms of $H_{\varepsilon}(\xi_g)$, are derived.
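As a minimal sketch of the conventions presumably in force here (the abstract does not state them explicitly), $H_{\varepsilon}(\xi)$ would be the Kolmogorov $\varepsilon$-entropy under a mean-square fidelity criterion, and $\mathcal{H}_{\xi_g}(\xi)$ the relative entropy of the induced measures:
\begin{equation}
H_{\varepsilon}(\xi) = \inf\Bigl\{\, I(\xi;\tilde{\xi}) \;:\; E \int_0^T \bigl(\xi(t) - \tilde{\xi}(t)\bigr)^2\, dt \leq \varepsilon^2 \,\Bigr\},
\end{equation}
\begin{equation}
\mathcal{H}_{\xi_g}(\xi) = \int \log \frac{d\mu_{\xi}}{d\mu_{\xi_g}}\, d\mu_{\xi},
\end{equation}
where $I(\cdot\,;\cdot)$ denotes mutual information, the infimum is taken over joint distributions of $\xi$ and a reproducing process $\tilde{\xi}$, and $\mu_{\xi}$, $\mu_{\xi_g}$ are the measures induced on function space by $\xi$ and $\xi_g$. Under such conventions the Gaussian term $H_{\varepsilon}(\xi_g)$ is evaluable by Kolmogorov's classical reverse water-filling (parametric) formula over the eigenvalues of $K(s,t)$, so bounds of the stated form reduce the non-Gaussian problem to a computable Gaussian one.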
