Least angle regression
- Open Access journal article
- Published 1 April 2004 by the Institute of Mathematical Statistics in The Annals of Statistics
- Vol. 32 (2), 407-499
- https://doi.org/10.1214/009053604000000067
Abstract
The purpose of model selection algorithms such as All Subsets, Forward Selection and Backward Elimination is to choose a linear model on the basis of the same set of data to which the model will be applied. Typically we have available a large collection of possible covariates from which we hope to select a parsimonious set for the efficient prediction of a response variable. Least Angle Regression (LARS), a new model selection algorithm, is a useful and less greedy version of traditional forward selection methods. Three main properties are derived:

1. A simple modification of the LARS algorithm implements the Lasso, an attractive version of ordinary least squares that constrains the sum of the absolute regression coefficients; the LARS modification calculates all possible Lasso estimates for a given problem, using an order of magnitude less computer time than previous methods.
2. A different LARS modification efficiently implements Forward Stagewise linear regression, another promising new model selection method; this connection explains the similar numerical results previously observed for the Lasso and Stagewise, and helps us understand the properties of both methods, which are seen as constrained versions of the simpler LARS algorithm.
3. A simple approximation for the degrees of freedom of a LARS estimate is available, from which we derive a Cp estimate of prediction error; this allows a principled choice among the range of possible LARS estimates.

LARS and its variants are computationally efficient: the paper describes a publicly available algorithm that requires only the same order of magnitude of computational effort as ordinary least squares applied to the full set of covariates.
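The Lasso estimate referred to in property (1) minimizes the residual sum of squares subject to a bound on the sum of absolute coefficients, Σj|βj| ≤ t, and LARS traces out the entire path of such solutions as t varies. As a minimal sketch of how properties (1) and (3) fit together (an illustration, not the authors' published S-Plus/R implementation), the snippet below uses scikit-learn's `lars_path` on the diabetes data analyzed in the paper, then applies the abstract's Cp recipe, Cp = RSS/σ² − n + 2·df, with df of the k-step fit approximated by k; estimating σ² from the full OLS fit is a conventional choice assumed here, not something the abstract specifies.

```python
# Minimal sketch of the abstract's ideas via scikit-learn's LARS
# implementation; an illustration, not the authors' published code.
import numpy as np
from sklearn.datasets import load_diabetes
from sklearn.linear_model import lars_path

X, y = load_diabetes(return_X_y=True)      # the diabetes data analyzed in the paper
n, p = X.shape
Xc, yc = X - X.mean(axis=0), y - y.mean()  # center so no intercept is needed

# One call computes the full Lasso path -- every breakpoint of the
# piecewise-linear coefficient profiles -- at roughly the cost of one OLS fit.
alphas, active, coefs = lars_path(Xc, yc, method="lasso")

# sigma^2 from the full OLS model: a conventional plug-in for Cp (an
# assumption here; the abstract does not specify the variance estimate).
beta_ols, *_ = np.linalg.lstsq(Xc, yc, rcond=None)
rss_full = np.sum((yc - Xc @ beta_ols) ** 2)
sigma2 = rss_full / (n - p - 1)

# Cp = RSS/sigma^2 - n + 2*df, with df of the k-step LARS fit taken as k,
# per the abstract's degrees-of-freedom approximation.
cp = []
for k in range(coefs.shape[1]):
    rss = np.sum((yc - Xc @ coefs[:, k]) ** 2)
    cp.append(rss / sigma2 - n + 2 * k)

best = int(np.argmin(cp))
print(f"Cp chooses step {best} with "
      f"{np.count_nonzero(coefs[:, best])} nonzero coefficients")
```

Because the whole path has only a handful of breakpoints, scanning Cp over every candidate fit is essentially free, which is the "principled choice among the range of possible LARS estimates" the abstract describes.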