Second-order SMO improves SVM online and active learning

Publication: Contribution to journal › Journal article › Research › peer-reviewed

Standard

Second-order SMO improves SVM online and active learning. / Glasmachers, Tobias; Igel, Christian.

In: Neural Computation, Vol. 20, No. 2, 2008, p. 374-382.


Harvard

Glasmachers, T & Igel, C 2008, 'Second-order SMO improves SVM online and active learning', Neural Computation, vol. 20, no. 2, pp. 374-382. https://doi.org/10.1162/neco.2007.10-06-354

APA

Glasmachers, T., & Igel, C. (2008). Second-order SMO improves SVM online and active learning. Neural Computation, 20(2), 374-382. https://doi.org/10.1162/neco.2007.10-06-354

Vancouver

Glasmachers T, Igel C. Second-order SMO improves SVM online and active learning. Neural Computation. 2008;20(2):374-382. https://doi.org/10.1162/neco.2007.10-06-354

Author

Glasmachers, Tobias ; Igel, Christian. / Second-order SMO improves SVM online and active learning. In: Neural Computation. 2008 ; Vol. 20, No. 2. pp. 374-382.

Bibtex

@article{85493443e57f46e5ad4da106e74cf988,
title = "Second-order SMO improves SVM online and active learning",
abstract = "Iterative learning algorithms that approximate the solution of support vector machines (SVMs) have two potential advantages. First, they allow online and active learning. Second, for large data sets, computing the exact SVM solution may be too time-consuming, and an efficient approximation can be preferable. The powerful LASVM iteratively approaches the exact SVM solution using sequential minimal optimization (SMO). It allows efficient online and active learning. Here, this algorithm is considerably improved in speed and accuracy by replacing the working set selection in the SMO steps. A second-order working set selection strategy, which greedily aims at maximizing the progress in each single step, is incorporated.",
author = "Tobias Glasmachers and Christian Igel",
year = "2008",
doi = "10.1162/neco.2007.10-06-354",
language = "English",
volume = "20",
pages = "374--382",
journal = "Neural Computation",
issn = "0899-7667",
publisher = "M I T Press",
number = "2",

}

RIS

TY - JOUR

T1 - Second-order SMO improves SVM online and active learning

AU - Glasmachers, Tobias

AU - Igel, Christian

PY - 2008

Y1 - 2008

N2 - Iterative learning algorithms that approximate the solution of support vector machines (SVMs) have two potential advantages. First, they allow online and active learning. Second, for large data sets, computing the exact SVM solution may be too time-consuming, and an efficient approximation can be preferable. The powerful LASVM iteratively approaches the exact SVM solution using sequential minimal optimization (SMO). It allows efficient online and active learning. Here, this algorithm is considerably improved in speed and accuracy by replacing the working set selection in the SMO steps. A second-order working set selection strategy, which greedily aims at maximizing the progress in each single step, is incorporated.

AB - Iterative learning algorithms that approximate the solution of support vector machines (SVMs) have two potential advantages. First, they allow online and active learning. Second, for large data sets, computing the exact SVM solution may be too time-consuming, and an efficient approximation can be preferable. The powerful LASVM iteratively approaches the exact SVM solution using sequential minimal optimization (SMO). It allows efficient online and active learning. Here, this algorithm is considerably improved in speed and accuracy by replacing the working set selection in the SMO steps. A second-order working set selection strategy, which greedily aims at maximizing the progress in each single step, is incorporated.

U2 - 10.1162/neco.2007.10-06-354

DO - 10.1162/neco.2007.10-06-354

M3 - Journal article

C2 - 18045012

VL - 20

SP - 374

EP - 382

JO - Neural Computation

JF - Neural Computation

SN - 0899-7667

IS - 2

ER -

ID: 32645826
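
The second-order working set selection referred to in the abstract chooses the SMO pair (i, j) by maximizing the estimated one-step gain in the dual objective, (g_i - g_j)^2 / (2 (K_ii + K_jj - 2 K_ij)), instead of the purely gradient-based (first-order) violation criterion. The following is a minimal sketch of that selection rule in simplified form (ignoring label bookkeeping and box clipping), assuming a precomputed kernel matrix and gradient; it follows the generic second-order criterion of Fan, Chen & Lin (2005), on which the paper builds, not the authors' LASVM code, and all identifiers (second_order_pair, up_idx, down_idx) are hypothetical.

import numpy as np

def second_order_pair(K, grad, up_idx, down_idx, tau=1e-12):
    # Hypothetical sketch of second-order SMO working set selection.
    # K        : (n, n) precomputed kernel matrix
    # grad     : (n,) gradient of the dual objective
    # up_idx   : indices whose coefficient may still increase
    # down_idx : indices whose coefficient may still decrease
    # First coordinate: most violating direction (the first-order choice).
    i = up_idx[np.argmax(grad[up_idx])]
    best_j, best_gain = -1, -np.inf
    for j in down_idx:
        diff = grad[i] - grad[j]
        if diff <= 0.0:
            continue  # j is not a violating partner for i
        # Curvature of the objective along the (i, j) pair direction.
        eta = max(K[i, i] + K[j, j] - 2.0 * K[i, j], tau)
        # Estimated objective gain of an unclipped step on (i, j).
        gain = diff * diff / (2.0 * eta)
        if gain > best_gain:
            best_gain, best_j = gain, j
    return i, best_j, best_gain

For an unclipped step, the gain above is exactly the increase in the dual objective along the (i, j) direction; greedily maximizing it is what the abstract means by "maximizing the progress in each single step", and it typically requires far fewer SMO iterations than first-order selection.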