SMO Algorithm for Least-Squares SVM Formulations
- 1 February 2003
- journal article
- Published by MIT Press in Neural Computation
- Vol. 15 (2), 487-507
- https://doi.org/10.1162/089976603762553013
Abstract
This article extends the well-known sequential minimal optimization (SMO) algorithm for support vector machines (SVMs) to least-squares SVM formulations, including LS-SVM classification, kernel ridge regression, and a particular form of regularized kernel Fisher discriminant. The algorithm is shown to be asymptotically convergent. It is also extremely easy to implement. Computational experiments show that the algorithm is fast and scales efficiently (quadratically) as a function of the number of examples.
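The abstract's core idea can be illustrated with a short sketch. In the LS-SVM dual, slack errors are penalized quadratically, so the dual variables have no box constraints and only the single equality constraint survives; an SMO-style method then picks the most-violating pair of variables and solves the two-variable subproblem exactly. The sketch below is a hypothetical minimal implementation under these assumptions (objective min (1/2)aᵀQa − yᵀa with Q = K + (1/γ)I and Σaᵢ = 0); it is not the paper's exact pseudocode, and the function name and parameters are illustrative.

```python
import numpy as np

def smo_lssvm(K, y, gamma=1.0, tol=1e-6, max_iter=10000):
    """SMO-style pairwise solver for the LS-SVM dual (illustrative sketch):
        min_a (1/2) a^T Q a - y^T a   s.t.  sum(a) = 0,
    with Q = K + (1/gamma) I. Quadratic error penalties mean the dual
    variables are unbounded, so no box-constraint clipping is needed."""
    n = len(y)
    Q = K + np.eye(n) / gamma
    a = np.zeros(n)
    g = -y.astype(float)                   # gradient Q a - y, evaluated at a = 0
    for _ in range(max_iter):
        i, j = np.argmin(g), np.argmax(g)  # most-violating pair of indices
        if g[j] - g[i] < tol:              # optimality: all gradient entries equal
            break
        eta = Q[i, i] + Q[j, j] - 2.0 * Q[i, j]
        t = (g[j] - g[i]) / eta            # exact minimizer along the pair direction
        a[i] += t                          # opposite-sign steps keep sum(a) = 0
        a[j] -= t
        g += t * (Q[:, i] - Q[:, j])       # rank-two gradient refresh
    b = -g.mean()                          # bias from the stationarity condition
    return a, b
```

A decision value for a training point k is then `(K @ a)[k] + b`; each iteration costs O(n) for the gradient refresh, consistent with the overall quadratic scaling the abstract reports.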
This publication has 3 references indexed in Scilit:
- Bayesian Framework for Least-Squares Support Vector Machine Classifiers, Gaussian Processes, and Kernel Fisher Discriminant Analysis. Neural Computation, 2002
- Improvements to Platt's SMO Algorithm for SVM Classifier Design. Neural Computation, 2001
- Least Squares Support Vector Machine Classifiers. Neural Processing Letters, 1999