Gradient-based adaptation of general Gaussian kernels

Research output: Contribution to journal › Journal article › Research › peer-review

Standard

Gradient-based adaptation of general Gaussian kernels. / Glasmachers, Tobias; Igel, Christian.

In: Neural Computation, Vol. 17, No. 10, 2005, p. 2099-2105.

Harvard

Glasmachers, T & Igel, C 2005, 'Gradient-based adaptation of general Gaussian kernels', Neural Computation, vol. 17, no. 10, pp. 2099-2105. https://doi.org/10.1162/0899766054615635

APA

Glasmachers, T., & Igel, C. (2005). Gradient-based adaptation of general Gaussian kernels. Neural Computation, 17(10), 2099-2105. https://doi.org/10.1162/0899766054615635

Vancouver

Glasmachers T, Igel C. Gradient-based adaptation of general Gaussian kernels. Neural Computation. 2005;17(10):2099-2105. https://doi.org/10.1162/0899766054615635

Author

Glasmachers, Tobias ; Igel, Christian. / Gradient-based adaptation of general Gaussian kernels. In: Neural Computation. 2005 ; Vol. 17, No. 10. pp. 2099-2105.

Bibtex

@article{9fc79769343145c1a220a910d77ca15e,
title = "Gradient-based adaptation of general {G}aussian kernels",
abstract = "Gradient-based optimizing of Gaussian kernel functions is considered. The gradient for the adaptation of scaling and rotation of the input space is computed to achieve invariance against linear transformations. This is done by using the exponential map as a parameterization of the kernel parameter manifold. By restricting the optimization to a constant trace subspace, the kernel size can be controlled. This is, for example, useful to prevent overfitting when minimizing radius-margin generalization performance measures. The concepts are demonstrated by training hard margin support vector machines on toy data.",
author = "Tobias Glasmachers and Christian Igel",
year = "2005",
doi = "10.1162/0899766054615635",
language = "English",
volume = "17",
pages = "2099--2105",
journal = "Neural Computation",
issn = "0899-7667",
publisher = "MIT Press",
number = "10",
}

RIS

TY - JOUR

T1 - Gradient-based adaptation of general Gaussian kernels

AU - Glasmachers, Tobias

AU - Igel, Christian

PY - 2005

Y1 - 2005

N2 - Gradient-based optimizing of Gaussian kernel functions is considered. The gradient for the adaptation of scaling and rotation of the input space is computed to achieve invariance against linear transformations. This is done by using the exponential map as a parameterization of the kernel parameter manifold. By restricting the optimization to a constant trace subspace, the kernel size can be controlled. This is, for example, useful to prevent overfitting when minimizing radius-margin generalization performance measures. The concepts are demonstrated by training hard margin support vector machines on toy data.

AB - Gradient-based optimizing of Gaussian kernel functions is considered. The gradient for the adaptation of scaling and rotation of the input space is computed to achieve invariance against linear transformations. This is done by using the exponential map as a parameterization of the kernel parameter manifold. By restricting the optimization to a constant trace subspace, the kernel size can be controlled. This is, for example, useful to prevent overfitting when minimizing radius-margin generalization performance measures. The concepts are demonstrated by training hard margin support vector machines on toy data.

U2 - 10.1162/0899766054615635

DO - 10.1162/0899766054615635

M3 - Journal article

C2 - 16105219

VL - 17

SP - 2099

EP - 2105

JO - Neural Computation

JF - Neural Computation

SN - 0899-7667

IS - 10

ER -

ID: 32645794
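
The abstract above describes two ingredients: a general Gaussian kernel k(x, z) = exp(-(x - z)^T A (x - z)) whose metric A = exp(B) is parameterized via the exponential map of a symmetric matrix B, and gradient steps restricted to the constant-trace subspace so that the kernel size (det A = e^{tr B}) stays fixed. A minimal NumPy sketch of these two pieces, assuming plain gradient descent and a caller-supplied gradient G rather than the radius-margin gradient derived in the paper:

```python
import numpy as np

def expm_sym(B):
    # Matrix exponential of a symmetric matrix via eigendecomposition:
    # exp(B) = V diag(exp(w)) V^T, which is guaranteed positive definite.
    w, V = np.linalg.eigh(B)
    return (V * np.exp(w)) @ V.T

def general_gaussian_kernel(x, z, B):
    # k(x, z) = exp(-(x - z)^T A (x - z)) with A = exp(B), B symmetric.
    A = expm_sym(B)
    d = x - z
    return np.exp(-d @ A @ d)

def constant_trace_step(B, G, lr=0.1):
    # One gradient step restricted to the constant-trace subspace:
    # symmetrize G and subtract its trace component, so tr(B) -- and
    # hence det(A) = exp(tr(B)), the kernel "size" -- is unchanged.
    n = B.shape[0]
    G = 0.5 * (G + G.T)
    G_traceless = G - (np.trace(G) / n) * np.eye(n)
    return B - lr * G_traceless
```

With B = 0 this reduces to the ordinary Gaussian kernel exp(-||x - z||^2), and any number of `constant_trace_step` updates leaves tr(B) fixed, which is how overfitting via uncontrolled kernel shrinkage is prevented in the paper's setting.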