SMO Algorithm for Least-Squares SVM Formulations

Abstract
This article extends the well-known sequential minimal optimization (SMO) algorithm for support vector machines (SVMs) to least-squares SVM formulations, which include LS-SVM classification, kernel ridge regression, and a particular form of regularized kernel Fisher discriminant. The resulting algorithm is shown to be asymptotically convergent and is extremely easy to implement. Computational experiments show that it is fast and scales efficiently (quadratically) with the number of training examples.
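To make the idea concrete, below is a minimal sketch of an SMO-style coordinate-descent solver for the dual of kernel ridge regression, one of the formulations named in the abstract. It assumes a bias-free model, so no equality constraint arises and each step exactly minimizes the dual objective in a single coordinate; the article's algorithm is more general than this simplification. The names rbf_kernel and smo_krr, and all parameter choices, are illustrative and not taken from the article.

    import numpy as np

    def rbf_kernel(X, gamma=1.0):
        # Gram matrix for an RBF kernel; any positive-definite kernel works here.
        sq = np.sum(X ** 2, axis=1)
        d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
        return np.exp(-gamma * d2)

    def smo_krr(K, y, C=10.0, tol=1e-6, max_iter=10000):
        # Solve min_a 0.5 * a^T (K + I/C) a - y^T a by exact one-coordinate updates.
        n = len(y)
        a = np.zeros(n)
        g = -y.astype(float).copy()        # gradient (K + I/C) a - y at a = 0
        diag = np.diag(K) + 1.0 / C        # per-coordinate curvature
        for _ in range(max_iter):
            i = int(np.argmax(np.abs(g)))  # most-violating coordinate
            if abs(g[i]) < tol:            # gradient ~ 0: dual optimum reached
                break
            delta = -g[i] / diag[i]        # exact minimizer along coordinate i
            a[i] += delta
            g += delta * K[:, i]           # update gradient for the K term
            g[i] += delta / C              # ... and for the I/C term
        return a

    # Usage sketch: fit on toy data and predict at the training points.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((50, 2))
    y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)
    K = rbf_kernel(X)
    alpha = smo_krr(K, y)
    y_hat = K @ alpha                      # predictions f(x_j) = sum_i alpha_i k(x_i, x_j)

Because the dual objective is a strictly convex quadratic, these one-coordinate updates converge to the unique optimum; the per-step cost is dominated by the single kernel column K[:, i], which is what makes SMO-style solvers attractive for large training sets.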