Abstract
Let Q be a compact set in the Euclidean space E^p, μ(dx) a measure defined on Q, and F, G two finite-dimensional linear subspaces of the L₂(μ)-space of continuous functions on Q. Let f(x) be an unknown function from F, and let the L₂-distance of f(x) from G be known not to exceed a given constant α². Given measurements of f(x) with uncorrelated errors on a set of points in Q, the L₂-norm optimal linear estimation of f ∈ F by a function from G is formulated as a minimax problem, where the maximum is taken over all f ∈ F whose L₂-distance from G does not exceed α². This estimation problem is reduced to a matrix problem in which λ(XᵀX) is the maximal eigenvalue of the matrix XᵀX. Based on this matrix problem, the existence of a solution, as well as some properties, bounds, and suboptimal solutions for the initial estimation problem, are obtained.
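The abstract does not give the exact minimax functional or the reduced matrix problem, so the following is only a minimal illustrative sketch of the kind of objects it mentions: a design matrix X built from a (hypothetical) basis of F evaluated at measurement points in Q, noisy observations of f with uncorrelated errors, a linear (least-squares) estimate, and the quantity λ(XᵀX), the maximal eigenvalue that the reduced matrix problem is stated in terms of. The choice Q = [0, 1], the polynomial basis, the coefficients, and the noise level are all assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical compact Q = [0, 1] and basis of F = span{1, x, x^2} (assumption).
def basis(x):
    return np.stack([np.ones_like(x), x, x**2], axis=-1)

# Measurement points in Q and an unknown f in F (coefficients are made up).
points = np.linspace(0.0, 1.0, 25)
true_coeffs = np.array([0.5, -1.0, 2.0])
X = basis(points)                                   # design matrix
y = X @ true_coeffs + 0.05 * rng.standard_normal(len(points))  # uncorrelated errors

# One possible linear estimator: ordinary least squares for f's coefficients.
coeffs_hat, *_ = np.linalg.lstsq(X, y, rcond=None)

# Maximal eigenvalue lambda(X^T X) referred to in the reduced matrix problem.
lam_max = np.linalg.eigvalsh(X.T @ X).max()
print("estimated coefficients:", coeffs_hat)
print("lambda(X^T X) =", lam_max)
```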
