Oblivious sketching of high-degree polynomial kernels

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review

Standard

Oblivious sketching of high-degree polynomial kernels. / Ahle, Thomas D.; Kapralov, Michael; Knudsen, Jakob B.T.; Pagh, Rasmus; Velingker, Ameya; Woodruff, David P.; Zandieh, Amir.

31st Annual ACM-SIAM Symposium on Discrete Algorithms, SODA 2020. ed. / Shuchi Chawla. Association for Computing Machinery, 2020. p. 141-160.

Harvard

Ahle, TD, Kapralov, M, Knudsen, JBT, Pagh, R, Velingker, A, Woodruff, DP & Zandieh, A 2020, Oblivious sketching of high-degree polynomial kernels. in S Chawla (ed.), 31st Annual ACM-SIAM Symposium on Discrete Algorithms, SODA 2020. Association for Computing Machinery, pp. 141-160, 31st Annual ACM-SIAM Symposium on Discrete Algorithms, SODA 2020, Salt Lake City, United States, 05/01/2020.

APA

Ahle, T. D., Kapralov, M., Knudsen, J. B. T., Pagh, R., Velingker, A., Woodruff, D. P., & Zandieh, A. (2020). Oblivious sketching of high-degree polynomial kernels. In S. Chawla (Ed.), 31st Annual ACM-SIAM Symposium on Discrete Algorithms, SODA 2020 (pp. 141-160). Association for Computing Machinery.

Vancouver

Ahle TD, Kapralov M, Knudsen JBT, Pagh R, Velingker A, Woodruff DP et al. Oblivious sketching of high-degree polynomial kernels. In Chawla S, editor, 31st Annual ACM-SIAM Symposium on Discrete Algorithms, SODA 2020. Association for Computing Machinery. 2020. p. 141-160

Author

Ahle, Thomas D. ; Kapralov, Michael ; Knudsen, Jakob B.T. ; Pagh, Rasmus ; Velingker, Ameya ; Woodruff, David P. ; Zandieh, Amir. / Oblivious sketching of high-degree polynomial kernels. 31st Annual ACM-SIAM Symposium on Discrete Algorithms, SODA 2020. editor / Shuchi Chawla. Association for Computing Machinery, 2020. pp. 141-160

Bibtex

@inproceedings{020d1f6460424d1fb00957fcfd44aa37,
title = "Oblivious sketching of high-degree polynomial kernels",
abstract = "Kernel methods are fundamental tools in machine learning that allow detection of non-linear dependencies between data without explicitly constructing feature vectors in high dimensional spaces. A major disadvantage of kernel methods is their poor scalability: primitives such as kernel PCA or kernel ridge regression generally take prohibitively large quadratic space and (at least) quadratic time, as kernel matrices are usually dense. Some methods for speeding up kernel linear algebra are known, but they all invariably take time exponential in either the dimension of the input point set (e.g., fast multipole methods suffer from the curse of dimensionality) or in the degree of the kernel function. Oblivious sketching has emerged as a powerful approach to speeding up numerical linear algebra over the past decade, but our understanding of oblivious sketching solutions for kernel matrices has remained quite limited, suffering from the aforementioned exponential dependence on input parameters. Our main contribution is a general method for applying sketching solutions developed in numerical linear algebra over the past decade to a tensoring of data points without forming the tensoring explicitly. This leads to the first oblivious sketch for the polynomial kernel with a target dimension that is only polynomially dependent on the degree of the kernel function, as well as the first oblivious sketch for the Gaussian kernel on bounded datasets that does not suffer from an exponential dependence on the dimensionality of input data points.",
author = "Ahle, {Thomas D.} and Michael Kapralov and Knudsen, {Jakob B.T.} and Rasmus Pagh and Ameya Velingker and Woodruff, {David P.} and Amir Zandieh",
year = "2020",
language = "English",
pages = "141--160",
editor = "Shuchi Chawla",
booktitle = "31st Annual ACM-SIAM Symposium on Discrete Algorithms, SODA 2020",
publisher = "Association for Computing Machinery",
note = "31st Annual ACM-SIAM Symposium on Discrete Algorithms, SODA 2020 ; Conference date: 05-01-2020 Through 08-01-2020",

}

RIS

TY - GEN

T1 - Oblivious sketching of high-degree polynomial kernels

AU - Ahle, Thomas D.

AU - Kapralov, Michael

AU - Knudsen, Jakob B.T.

AU - Pagh, Rasmus

AU - Velingker, Ameya

AU - Woodruff, David P.

AU - Zandieh, Amir

PY - 2020

Y1 - 2020

N2 - Kernel methods are fundamental tools in machine learning that allow detection of non-linear dependencies between data without explicitly constructing feature vectors in high dimensional spaces. A major disadvantage of kernel methods is their poor scalability: primitives such as kernel PCA or kernel ridge regression generally take prohibitively large quadratic space and (at least) quadratic time, as kernel matrices are usually dense. Some methods for speeding up kernel linear algebra are known, but they all invariably take time exponential in either the dimension of the input point set (e.g., fast multipole methods suffer from the curse of dimensionality) or in the degree of the kernel function. Oblivious sketching has emerged as a powerful approach to speeding up numerical linear algebra over the past decade, but our understanding of oblivious sketching solutions for kernel matrices has remained quite limited, suffering from the aforementioned exponential dependence on input parameters. Our main contribution is a general method for applying sketching solutions developed in numerical linear algebra over the past decade to a tensoring of data points without forming the tensoring explicitly. This leads to the first oblivious sketch for the polynomial kernel with a target dimension that is only polynomially dependent on the degree of the kernel function, as well as the first oblivious sketch for the Gaussian kernel on bounded datasets that does not suffer from an exponential dependence on the dimensionality of input data points.

AB - Kernel methods are fundamental tools in machine learning that allow detection of non-linear dependencies between data without explicitly constructing feature vectors in high dimensional spaces. A major disadvantage of kernel methods is their poor scalability: primitives such as kernel PCA or kernel ridge regression generally take prohibitively large quadratic space and (at least) quadratic time, as kernel matrices are usually dense. Some methods for speeding up kernel linear algebra are known, but they all invariably take time exponential in either the dimension of the input point set (e.g., fast multipole methods suffer from the curse of dimensionality) or in the degree of the kernel function. Oblivious sketching has emerged as a powerful approach to speeding up numerical linear algebra over the past decade, but our understanding of oblivious sketching solutions for kernel matrices has remained quite limited, suffering from the aforementioned exponential dependence on input parameters. Our main contribution is a general method for applying sketching solutions developed in numerical linear algebra over the past decade to a tensoring of data points without forming the tensoring explicitly. This leads to the first oblivious sketch for the polynomial kernel with a target dimension that is only polynomially dependent on the degree of the kernel function, as well as the first oblivious sketch for the Gaussian kernel on bounded datasets that does not suffer from an exponential dependence on the dimensionality of input data points.

UR - http://www.scopus.com/inward/record.url?scp=85084087759&partnerID=8YFLogxK

M3 - Article in proceedings

AN - SCOPUS:85084087759

SP - 141

EP - 160

BT - 31st Annual ACM-SIAM Symposium on Discrete Algorithms, SODA 2020

A2 - Chawla, Shuchi

PB - Association for Computing Machinery

T2 - 31st Annual ACM-SIAM Symposium on Discrete Algorithms, SODA 2020

Y2 - 5 January 2020 through 8 January 2020

ER -

ID: 258720675
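
Illustration (not part of the record)

The abstract describes sketching a tensoring of data points without ever forming the tensoring explicitly. The Python snippet below is a minimal, hypothetical illustration of that principle using the classic TensorSketch construction (CountSketch of each factor, combined via FFT); it is not the recursive sketch constructed in the paper, and the function and parameter names (make_tensorsketch, d, m, q) are illustrative only.

import numpy as np

rng = np.random.default_rng(0)

def make_tensorsketch(d, m, q, rng):
    # Draw q independent CountSketch hash and sign functions for input dimension d.
    hashes = rng.integers(0, m, size=(q, d))       # bucket index for every coordinate
    signs = rng.choice([-1.0, 1.0], size=(q, d))   # random sign for every coordinate

    def sketch(x):
        # CountSketch each of the q copies of x, then multiply their FFTs.
        # Multiplication in the Fourier domain is circular convolution of the
        # CountSketches, which sketches x ⊗ x ⊗ ... ⊗ x without materializing it.
        prod = np.ones(m, dtype=complex)
        for i in range(q):
            cs = np.zeros(m)
            np.add.at(cs, hashes[i], signs[i] * x)  # CountSketch of one copy of x
            prod *= np.fft.fft(cs)
        return np.real(np.fft.ifft(prod))

    return sketch

# Inner products of sketches approximate the degree-q polynomial kernel <x, y>^q.
d, m, q = 50, 4096, 3
sk = make_tensorsketch(d, m, q, rng)
x, y = rng.standard_normal(d), rng.standard_normal(d)
print(np.dot(sk(x), sk(y)), np.dot(x, y) ** q)

The point shared with the abstract is that the q-fold tensor product, which has dimension d^q, is never built: each sketch is computed in roughly O(q(d + m log m)) time from the original d-dimensional vector.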