
Second-order SMO improves SVM online and active learning

Publication: Journal contribution · Journal article · Research · Peer-reviewed

Tobias Glasmachers, Christian Igel

Iterative learning algorithms that approximate the solution of support vector machines (SVMs) have two potential advantages. First, they allow online and active learning. Second, for large data sets, computing the exact SVM solution may be too time-consuming, and an efficient approximation can be preferable. The powerful LASVM iteratively approaches the exact SVM solution using sequential minimal optimization (SMO). It allows efficient online and active learning. Here, this algorithm is considerably improved in speed and accuracy by replacing the working set selection in the SMO steps. A second-order working set selection strategy, which greedily aims at maximizing the progress in each single step, is incorporated.
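The greedy second-order selection described above can be sketched as follows. This is a minimal illustration of a LIBSVM-style second-order gain criterion, which the paper's strategy builds on; all variable names and the exact interface are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def second_order_pair(K, grad, up, down):
    """Select a working pair (i, j) for an SMO step using a
    second-order gain criterion (illustrative sketch).

    K    : kernel (Gram) matrix of the training points
    grad : gradient of the dual objective w.r.t. the alphas
    up   : boolean mask of indices whose alpha may still increase
    down : boolean mask of indices whose alpha may still decrease
    """
    # First index: steepest-ascent (maximal gradient) among "up" candidates.
    i = int(np.flatnonzero(up)[np.argmax(grad[up])])

    # Second index: among "down" candidates violating optimality with i,
    # pick the one maximizing the gain of an exact SMO step on (i, j).
    j_cand = np.flatnonzero(down & (grad < grad[i]))
    if j_cand.size == 0:
        return i, None  # no violating pair left: dual optimum reached

    diff = grad[i] - grad[j_cand]                       # first-order violation
    curv = K[i, i] + K[j_cand, j_cand] - 2.0 * K[i, j_cand]
    curv = np.maximum(curv, 1e-12)                      # guard degenerate kernels
    gain = diff ** 2 / (2.0 * curv)                     # progress of one SMO step
    j = int(j_cand[np.argmax(gain)])
    return i, j
```

A first-order strategy would take the largest `diff` alone; weighting it by the curvature term `K[i,i] + K[j,j] - 2*K[i,j]` is what makes the selection second-order, favoring pairs where a single step actually gains the most.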
Original language: English
Journal: Neural Computation
Volume: 20
Issue: 2
Pages (from-to): 374-382
Number of pages: 9
ISSN: 0899-7667
DOI
Status: Published - 2008
Externally published: Yes

ID: 32645826