Rotation Forest: A New Classifier Ensemble Method
- 21 August 2006
- Research article
- Published by the Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Pattern Analysis and Machine Intelligence
- Vol. 28 (10), 1619-1630
- https://doi.org/10.1109/tpami.2006.211
Abstract
We propose a method for generating classifier ensembles based on feature extraction. To create the training data for a base classifier, the feature set is randomly split into K subsets (K is a parameter of the algorithm) and principal component analysis (PCA) is applied to each subset. All principal components are retained in order to preserve the variability information in the data. Thus, K axis rotations take place to form the new features for a base classifier. The idea of the rotation approach is to encourage individual accuracy and diversity within the ensemble simultaneously. Diversity is promoted through the feature extraction for each base classifier. Decision trees were chosen here because they are sensitive to rotation of the feature axes, hence the name "forest". Accuracy is sought by keeping all principal components and also by using the whole data set to train each base classifier. Using WEKA, we examined the rotation forest ensemble on a random selection of 33 benchmark data sets from the UCI repository and compared it with bagging, AdaBoost, and random forest. The results were favorable to rotation forest and prompted an investigation into the diversity-accuracy landscape of the ensemble models. Diversity-error diagrams revealed that rotation forest ensembles construct individual classifiers which are more accurate than those in AdaBoost and random forest, and more diverse than those in bagging, sometimes more accurate as well.
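The rotation step the abstract describes lends itself to a short sketch. The Python below is a minimal illustration, assuming NumPy and scikit-learn for the PCA and the decision trees; the function names, the default ensemble size, and the plain majority-vote combiner are assumptions made for this sketch, and the class/instance bootstrapping used in the published algorithm before fitting each PCA is omitted for brevity.

```python
# A minimal sketch of the rotation-forest idea described above, assuming
# NumPy and scikit-learn are available. The helper names, n_trees default,
# and plain majority vote are illustrative choices, not the paper's exact
# procedure: the published algorithm also bootstraps subsets of classes
# and instances before fitting each PCA, which is skipped here.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.tree import DecisionTreeClassifier


def fit_rotation_forest(X, y, n_trees=10, K=3, seed=None):
    """Train n_trees decision trees, each on its own rotated feature space."""
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    ensemble = []
    for _ in range(n_trees):
        # Randomly split the feature indices into K disjoint subsets.
        subsets = np.array_split(rng.permutation(n_features), K)
        # Assemble a block rotation matrix: one PCA per subset, keeping
        # all principal components to preserve the variability in the data.
        # (Mean-centering before projection is skipped for brevity.)
        R = np.zeros((n_features, n_features))
        for cols in subsets:
            pca = PCA(n_components=len(cols)).fit(X[:, cols])
            R[np.ix_(cols, cols)] = pca.components_.T
        # Train the base tree on the whole, rotated training set.
        tree = DecisionTreeClassifier().fit(X @ R, y)
        ensemble.append((R, tree))
    return ensemble


def predict_rotation_forest(ensemble, X):
    """Combine the base trees by majority vote (integer class labels)."""
    votes = np.stack([tree.predict(X @ R) for R, tree in ensemble])
    return np.apply_along_axis(
        lambda col: np.bincount(col.astype(int)).argmax(), 0, votes)
```

Each base tree sees the full training set, but through its own K-block rotation matrix; because trees are sensitive to rotations of the feature axes, this is what drives both the individual accuracy and the ensemble diversity discussed in the abstract.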