Hierarchical Variable Selection in Polynomial Regression Models
- 1 November 1987
- journal article
- research article
- Published by Informa UK Limited in The American Statistician
- Vol. 41 (4), 311-313
- https://doi.org/10.1080/00031305.1987.10475506
Abstract
Significance tests on coefficients of lower-order terms in polynomial regression models are affected by linear transformations of the predictors. For this reason, a polynomial regression model that excludes hierarchically inferior predictors (i.e., lower-order terms) is considered not well formulated. Existing variable-selection algorithms do not take the hierarchy of predictors into account and often select as "best" a model that is not hierarchically well formulated. This article proposes a theory of the hierarchical ordering of the predictors of an arbitrary polynomial regression model in m variables, where m is any positive integer. Ways of modifying existing algorithms to restrict their search to well-formulated models are suggested, and an algorithm that generates all possible well-formulated models is presented.
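The notion of hierarchical well-formulation described in the abstract can be sketched concretely. Representing each monomial term as a tuple of exponents (e.g., `(2, 1)` for x1²x2), a model is well formulated when it contains every lower-order term of each of its terms. The following is a minimal illustrative sketch of such a check, not the article's own algorithm; the function names and the tuple encoding are assumptions chosen for illustration:

```python
def inferior_terms(term):
    """Yield the immediately inferior terms of a monomial, obtained by
    lowering exactly one exponent by 1 (e.g., (2, 1) -> (1, 1) and (2, 0))."""
    for i, e in enumerate(term):
        if e > 0:
            yield term[:i] + (e - 1,) + term[i + 1:]

def is_well_formulated(model):
    """Return True iff the model (a collection of exponent tuples, with the
    all-zeros tuple representing the intercept) contains every hierarchically
    inferior term of each of its terms."""
    terms = set(model)
    return all(t in terms
               for term in terms
               for t in inferior_terms(term))
```

For example, a two-variable model with an x1·x2 interaction but no x2 main effect would fail this check, matching the article's point that stepwise selection can produce such models: `is_well_formulated({(0, 0), (1, 0), (1, 1)})` is `False`, while adding `(0, 1)` makes it `True`.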
This publication has 3 references indexed in Scilit:
- Testable hypotheses in singular fixed linear models. Communications in Statistics - Theory and Methods, 1986
- Linear Transformations of Polynomial Regression Models. The American Statistician, 1982
- The Interpretation of Least Squares Regression With Interaction or Polynomial Terms. The Review of Economics and Statistics, 1979