Abstract
An inversion procedure that provides the most conservative inference for an unknown function from partial data is discussed on the basis of information-theoretic considerations. The method rests on the procedure of maximal entropy but is not limited to the estimation of unknown probabilities; inductive inferences can be drawn regarding the values of general (if necessary, dimension-bearing) variables. The solution of an inversion problem whose data are linear in the unknown function is discussed in detail, and explicit results are obtained. For every class of problems with common symmetry properties, the general algorithm can be reduced to a more direct procedure. When the data consist of average values of an unknown distribution, the general approach is in the spirit of the Darwin-Fowler method, while the reduced route is the procedure of maximal entropy (the 'method of most probable distribution') as usually employed in statistical mechanics. Other classes of problems discussed include the representation of an unknown function in a complete orthonormal basis, using as input a partial set of expansion coefficients, and the inference of line shapes and power spectra.
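To make the simplest case concrete, the following is a minimal sketch (not taken from the paper) of the maximal-entropy procedure when the data consist of a single average value of an unknown distribution: the solution has the exponential form p_i ∝ exp(-λx_i), and the Lagrange multiplier λ is fixed so that the constraint is reproduced. The function name and the bisection search are illustrative choices, not the paper's algorithm.

```python
import math

def maxent_distribution(values, target_mean, iterations=200):
    """Maximal-entropy distribution over `values` subject to a mean constraint.

    The maximal-entropy solution is p_i = exp(-lam * x_i) / Z; we find the
    Lagrange multiplier `lam` by bisection, using the fact that the
    constrained mean is a monotonically decreasing function of lam.
    """
    def mean(lam):
        w = [math.exp(-lam * x) for x in values]
        Z = sum(w)
        return sum(wi * x for wi, x in zip(w, values)) / Z

    lo, hi = -50.0, 50.0  # bracket for lam; assumes target_mean lies in range
    for _ in range(iterations):
        mid = 0.5 * (lo + hi)
        if mean(mid) > target_mean:
            lo = mid  # mean too large: lam must increase
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(-lam * x) for x in values]
    Z = sum(w)
    return [wi / Z for wi in w], lam
```

For example, for a six-sided die constrained to have mean 4.5, the procedure yields a geometric-like distribution biased toward the higher faces; with the unconstrained mean 3.5 it recovers λ = 0, i.e. the uniform distribution.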