When repeated estimates must be made under field conditions from data that follow a nonlinear regression law, a simple polynomial function of the observations has considerable appeal as an estimator. The polynomial estimator of given finite degree with the smallest average mean squared error is derived. Conditions are given under which, as the degree increases, this estimator converges in probability to the Bayes estimator and its average mean squared error converges to the lower bound over all square-integrable estimators. In an example, a linear estimator outperforms the maximum likelihood estimator and performs nearly as well as the Bayes estimator.
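The construction described above can be sketched numerically: the polynomial estimator of degree d minimizing average (Bayes) mean squared error is the least-squares projection of the parameter onto powers of the observation, taken over joint draws from the prior and the observation model. The sketch below uses a hypothetical setup (uniform prior, exponential regression function, Gaussian noise) that is an illustrative assumption, not the paper's example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical model (not the paper's): theta ~ Uniform(0, 1) prior,
# and Y | theta = exp(theta) + Gaussian noise -- a nonlinear regression law.
n = 100_000
theta = rng.uniform(0.0, 1.0, n)             # draws from the prior
y = np.exp(theta) + rng.normal(0.0, 0.1, n)  # corresponding observations

def poly_estimator(y_train, theta_train, degree):
    """Least-squares fit of theta on powers of y: the degree-d polynomial
    estimator minimizing average mean squared error over the joint draws."""
    X = np.vander(y_train, degree + 1, increasing=True)
    coef, *_ = np.linalg.lstsq(X, theta_train, rcond=None)
    return coef

def apply_poly(coef, y_new):
    """Evaluate the fitted polynomial estimator at new observations."""
    X = np.vander(np.atleast_1d(y_new), len(coef), increasing=True)
    return X @ coef

# Average MSE is non-increasing in the degree and approaches the Bayes risk.
for d in (1, 2, 3):
    coef = poly_estimator(y, theta, d)
    mse = np.mean((apply_poly(coef, y) - theta) ** 2)
    print(f"degree {d}: average MSE = {mse:.5f}")
```

The key design point is that the coefficients depend only on moments of the joint distribution of the parameter and the observation, so under field conditions the fitted polynomial can be applied to each new observation with a handful of multiplications and additions.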