Adaptive Model Selection

Abstract
Most model selection procedures apply a fixed penalty to any increase in the size of a model. These nonadaptive procedures perform well in only one type of situation. For instance, the Bayesian information criterion (BIC), with its large penalty, performs well for “small” models and poorly for “large” models, whereas Akaike's information criterion (AIC) does just the opposite. This article proposes an adaptive model selection procedure that uses a data-adaptive complexity penalty based on a concept of generalized degrees of freedom. By combining the benefits of a class of nonadaptive procedures, the proposed procedure approximates the best performance of that class across a variety of situations. This class includes many well-known procedures, such as AIC, BIC, Mallows's Cp, and the risk inflation criterion (RIC). The proposed procedure is applied to wavelet thresholding in nonparametric regression and to variable selection in least squares regression. Simulation results and an asymptotic analysis support the effectiveness of the proposed procedure.
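As an illustration of the fixed-penalty class the abstract refers to (not the authors' adaptive procedure), the criteria AIC, Mallows's Cp, BIC, and RIC can all be written, for least squares variable selection with known noise variance, as minimizing RSS/σ² + λ·k over candidate models of size k, where λ is a fixed complexity penalty: λ = 2 for AIC/Cp, λ = log n for BIC, and λ = 2 log p for RIC. The sketch below applies these penalties to a nested sequence of candidate models on hypothetical simulated data; the data-generating design, coefficients, and σ are assumptions for illustration only.

```python
import numpy as np

# Hypothetical simulated data: a "small" true model (3 nonzero coefficients).
rng = np.random.default_rng(0)
n, p = 100, 10
sigma = 1.0
X = rng.standard_normal((n, p))
beta = np.array([3.0, 2.0, 1.5] + [0.0] * (p - 3))
y = X @ beta + sigma * rng.standard_normal(n)

def rss_of_first_k(k):
    """Residual sum of squares for the nested candidate using the first k columns."""
    if k == 0:
        return float(y @ y)
    Xk = X[:, :k]
    resid = y - Xk @ np.linalg.lstsq(Xk, y, rcond=None)[0]
    return float(resid @ resid)

def select(lam):
    """Model size minimizing the penalized criterion RSS/sigma^2 + lam * k."""
    scores = [rss_of_first_k(k) / sigma**2 + lam * k for k in range(p + 1)]
    return int(np.argmin(scores))

# Fixed (nonadaptive) penalties from the class the abstract describes:
k_aic = select(2.0)             # AIC / Mallows's Cp penalty
k_bic = select(np.log(n))       # BIC penalty
k_ric = select(2 * np.log(p))   # RIC penalty
print(k_aic, k_bic, k_ric)
```

Because the candidates are nested and the penalty is linear in k, a larger λ can only select a model of the same or smaller size, which is the trade-off the abstract highlights: BIC's larger penalty favors small models, AIC's smaller penalty favors large ones.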
