Abstract
$C_p$, $C_L$, cross-validation and generalized cross-validation are useful data-driven techniques for selecting a good estimate from a proposed class of linear estimates. The asymptotic behavior of these procedures is studied. Some easily interpretable conditions are derived to demonstrate their asymptotic optimality. It is argued that cross-validation and generalized cross-validation can be viewed as special ways of applying $C_L$. Applications in nearest-neighbor nonparametric regression and in model selection are discussed in detail.
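As a concrete illustration of the selection problem described above, the following minimal sketch (not taken from the paper) compares $C_L$ and generalized cross-validation for choosing among a class of linear estimates $\hat\mu_k = A(k)y$, here $k$-nearest-neighbor smoothers, one of the applications mentioned in the abstract. The toy data, the grid of $k$ values, and the assumed noise variance `sigma2` are illustrative assumptions, not quantities from the paper.

```python
import numpy as np

# Toy regression data (illustrative assumption, not from the paper).
rng = np.random.default_rng(0)
n = 200
x = np.sort(rng.uniform(0.0, 1.0, n))
mu = np.sin(2 * np.pi * x)                   # unknown regression function
sigma2 = 0.25                                # noise variance, assumed known for C_L
y = mu + rng.normal(0.0, np.sqrt(sigma2), n)

def knn_smoother_matrix(x, k):
    """Hat matrix A of the k-nearest-neighbor smoother:
    row i averages the responses of the k sample points closest to x[i]."""
    dist = np.abs(x[:, None] - x[None, :])
    A = np.zeros((len(x), len(x)))
    for i in range(len(x)):
        nbrs = np.argsort(dist[i])[:k]
        A[i, nbrs] = 1.0 / k
    return A

def c_L(y, A, sigma2):
    """Mallows' C_L: (RSS + 2 * sigma^2 * tr(A)) / n."""
    n = len(y)
    resid = y - A @ y
    return (resid @ resid + 2.0 * sigma2 * np.trace(A)) / n

def gcv(y, A):
    """Generalized cross-validation: (RSS / n) / (1 - tr(A)/n)^2."""
    n = len(y)
    resid = y - A @ y
    return (resid @ resid / n) / (1.0 - np.trace(A) / n) ** 2

ks = list(range(2, 60))
scores_cl, scores_gcv, losses = [], [], []
for k in ks:
    A = knn_smoother_matrix(x, k)
    scores_cl.append(c_L(y, A, sigma2))
    scores_gcv.append(gcv(y, A))
    losses.append(np.mean((A @ y - mu) ** 2))  # true average squared error, for reference

print("k chosen by C_L :", ks[int(np.argmin(scores_cl))])
print("k chosen by GCV :", ks[int(np.argmin(scores_gcv))])
print("loss-optimal k  :", ks[int(np.argmin(losses))])
```

In this setting, asymptotic optimality means that the loss of the estimate selected by $C_L$ or GCV becomes, in ratio, as small as the loss of the best candidate in the class as $n$ grows; the last printed line provides the (normally unobservable) benchmark for comparison.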