A Neural Support Vector Network architecture with adaptive kernels
- 1 January 2000
- conference paper
- Published by Institute of Electrical and Electronics Engineers (IEEE)
- Vol. 5, pp. 187-192
- https://doi.org/10.1109/ijcnn.2000.861455
Abstract
In the Support Vector Machines (SVM) framework, the positive-definite kernel can be seen as a fixed similarity measure between two patterns, and a discriminant function is obtained by taking a linear combination of the kernels computed at training examples called support vectors. We investigate learning architectures in which the kernel functions are replaced by more general similarity measures that can have arbitrary internal parameters. The training criterion used in SVMs is not appropriate for this purpose, so we adopt the simple criterion generally used when training neural networks for classification tasks. Several experiments show that such Neural Support Vector Networks perform similarly to SVMs while requiring significantly fewer support vectors, even when the similarity measure has no internal parameters.
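The idea sketched in the abstract can be illustrated with a minimal example, assuming details not given here: a discriminant of the form f(x) = Σᵢ αᵢ K_θ(x, xᵢ) + b, where the similarity K_θ (here a Gaussian with a learnable width γ, a hypothetical choice for illustration) is trained jointly with the weights by gradient descent on the logistic (cross-entropy) loss, the criterion commonly used for neural network classifiers. This is a sketch of the general approach, not the paper's exact architecture or training procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-class data: two Gaussian blobs, labels in {-1, +1}.
X = np.vstack([rng.normal(-1.0, 0.5, (20, 2)),
               rng.normal(+1.0, 0.5, (20, 2))])
y = np.concatenate([-np.ones(20), np.ones(20)])

def similarity(X1, X2, gamma):
    """Gaussian similarity with an adaptive width parameter gamma."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

alpha = np.zeros(len(X))   # one weight per training example ("support vector")
b = 0.0
gamma = 1.0                # internal parameter of the similarity, learned too
lr = 0.1

d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
for step in range(300):
    K = similarity(X, X, gamma)
    f = K @ alpha + b                  # discriminant: sum_j alpha_j K(x, x_j) + b
    p = 1.0 / (1.0 + np.exp(-y * f))   # P(correct label) under the logistic model
    g = -y * (1.0 - p)                 # per-example dLoss/df for logistic loss
    # Gradients of the mean logistic loss w.r.t. alpha, b, and gamma.
    grad_alpha = K.T @ g / len(X)
    grad_b = g.mean()
    grad_gamma = (g[:, None] * alpha[None, :] * K * (-d2)).sum() / len(X)
    alpha -= lr * grad_alpha
    b -= lr * grad_b
    gamma -= lr * grad_gamma

acc = ((similarity(X, X, gamma) @ alpha + b) * y > 0).mean()
print(f"training accuracy: {acc:.2f}")
```

Because every training example has a weight αᵢ and all parameters (including γ) are optimized by the same gradient-based criterion, sparsity in the αᵢ is not enforced by the loss; the paper's observation is that such networks nonetheless need far fewer support vectors than SVMs in practice.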