Training ν-Support Vector Classifiers: Theory and Algorithms
- 1 September 2001
- journal article
- Published by MIT Press in Neural Computation
- Vol. 13 (9), 2119-2147
- https://doi.org/10.1162/089976601750399335
Abstract
The ν-support vector machine (ν-SVM) for classification proposed by Schölkopf, Smola, Williamson, and Bartlett (2000) has the advantage of using a parameter ν to control the number of support vectors. In this article, we investigate the relation between ν-SVM and C-SVM in detail. We show that in general they are two different problems with the same optimal solution set. Hence, we may expect that many numerical aspects of solving them are similar. However, compared to regular C-SVM, the formulation of ν-SVM is more complicated, so up to now there have been no effective methods for solving large-scale ν-SVM. We propose a decomposition method for ν-SVM that is competitive with existing methods for C-SVM. We also discuss the behavior of ν-SVM through some numerical experiments.
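Not part of the published abstract, but a minimal sketch of the property it describes, using scikit-learn's SVC and NuSVC (the synthetic dataset and parameter values are illustrative assumptions): in the ν-SVM formulation, ν lower-bounds the fraction of support vectors and upper-bounds the fraction of margin errors, whereas C in C-SVM has no such direct interpretation.

```python
# Sketch (not from the paper): nu controlling the number of support
# vectors; the dataset and parameter values are illustrative only.
from sklearn.datasets import make_classification
from sklearn.svm import SVC, NuSVC

X, y = make_classification(n_samples=200, n_features=5, random_state=0)

# C-SVM: C penalizes margin violations but gives no direct handle
# on how many support vectors the solution will have.
c_svm = SVC(kernel="rbf", C=1.0).fit(X, y)
print("C-SVM support vectors:", c_svm.support_vectors_.shape[0])

# nu-SVM: nu in (0, 1] lower-bounds the fraction of support vectors
# and upper-bounds the fraction of margin errors.
for nu in (0.1, 0.3, 0.5):
    nu_svm = NuSVC(kernel="rbf", nu=nu).fit(X, y)
    frac_sv = nu_svm.support_vectors_.shape[0] / X.shape[0]
    print(f"nu={nu:.1f}: support-vector fraction = {frac_sv:.2f} (>= nu)")
```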
References
- Formulations of Support Vector Machines: A Note from an Optimization Point of View. Neural Computation, 2001.
- The analysis of decomposition methods for support vector machines. IEEE Transactions on Neural Networks, 2000.
- New Support Vector Algorithms. Neural Computation, 2000.
- A fast iterative nearest point algorithm for support vector machine classifier design. IEEE Transactions on Neural Networks, 2000.
- Successive overrelaxation for support vector machines. IEEE Transactions on Neural Networks, 1999.
- Interpolation of scattered data: Distance matrices and conditionally positive definite functions. Constructive Approximation, 1986.