Gradient-Based Adaptation of General Gaussian Kernels
- 1 October 2005
- journal article
- Published by MIT Press in Neural Computation
- Vol. 17 (10), 2099-2105
- https://doi.org/10.1162/0899766054615635
Abstract
Gradient-based optimization of Gaussian kernel functions is considered. The gradient for the adaptation of scaling and rotation of the input space is computed to achieve invariance against linear transformations. This is done by using the exponential map as a parameterization of the kernel parameter manifold. By restricting the optimization to a constant-trace subspace, the kernel size can be controlled. This is, for example, useful to prevent overfitting when minimizing radius-margin generalization performance measures. The concepts are demonstrated by training hard-margin support vector machines on toy data.
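For illustration, here is a minimal sketch of the construction the abstract describes: a general Gaussian kernel k(x, z) = exp(-(x - z)^T A (x - z)) whose positive definite matrix A is obtained through the exponential map A = exp(B) of a symmetric matrix B, with the gradient projected onto the constant-trace subspace so that the kernel size stays fixed. This is not the paper's implementation: the radius-margin bound is replaced here by kernel-target alignment as a simpler stand-in objective, and the function names, toy data, and step size are hypothetical.

```python
import jax
import jax.numpy as jnp

def kernel_matrix(B, X):
    """General Gaussian kernel K_ij = exp(-(x_i - x_j)^T A (x_i - x_j)),
    with A = exp(B) for symmetric B, so A is always positive definite."""
    Bs = (B + B.T) / 2.0                      # stay on the symmetric matrices
    w, V = jnp.linalg.eigh(Bs)                # exponential map via eigendecomposition
    A = (V * jnp.exp(w)) @ V.T                # (gradients need distinct eigenvalues)
    d = X[:, None, :] - X[None, :, :]         # pairwise differences
    q = jnp.einsum('ijk,kl,ijl->ij', d, A, d) # quadratic forms (x_i - x_j)^T A (x_i - x_j)
    return jnp.exp(-q)

def neg_alignment(B, X, y):
    """Negative kernel-target alignment; a stand-in objective for the paper's
    radius-margin bound, which would require training an SVM per step."""
    K = kernel_matrix(B, X)
    Y = jnp.outer(y, y)
    return -jnp.sum(K * Y) / (jnp.linalg.norm(K) * jnp.linalg.norm(Y))

@jax.jit
def step(B, X, y, lr=0.1):
    G = jax.grad(neg_alignment)(B, X, y)
    G = (G + G.T) / 2.0                       # keep the update symmetric
    n = B.shape[0]
    G = G - (jnp.trace(G) / n) * jnp.eye(n)   # project onto the constant-trace subspace
    return B - lr * G

# Hypothetical toy problem: XOR-like labels in the plane.
key = jax.random.PRNGKey(0)
X = jax.random.normal(key, (40, 2))
y = jnp.sign(X[:, 0] * X[:, 1])
B0 = 0.01 * jax.random.normal(key, (2, 2))
B = (B0 + B0.T) / 2.0                         # symmetric init with distinct eigenvalues
for _ in range(200):
    B = step(B, X, y)
```

Since det(exp(B)) = exp(tr(B)), holding tr(B) constant fixes det(A); this is one concrete sense in which restricting the optimization to a constant-trace subspace controls the kernel size.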