Split-kl and PAC-Bayes-split-kl Inequalities for Ternary Random Variables

Publication: Contribution to book/anthology/report › Article in proceedings › Research › peer-reviewed

Standard

Split-kl and PAC-Bayes-split-kl Inequalities for Ternary Random Variables. / Wu, Yi-Shan; Seldin, Yevgeny.

Advances in Neural Information Processing Systems 35 (NeurIPS 2022). NeurIPS Proceedings, 2022. pp. 11369-11381.

Publication: Contribution to book/anthology/report › Article in proceedings › Research › peer-reviewed

Harvard

Wu, Y-S & Seldin, Y 2022, Split-kl and PAC-Bayes-split-kl Inequalities for Ternary Random Variables. in Advances in Neural Information Processing Systems 35 (NeurIPS 2022). NeurIPS Proceedings, pp. 11369-11381, 36th Conference on Neural Information Processing Systems (NeurIPS 2022), New Orleans / Virtual, USA, 28/11/2022. <https://proceedings.neurips.cc/paper_files/paper/2022/file/49ffa271264808cf500ea528ed8ec9b3-Supplemental-Conference.pdf>

APA

Wu, Y-S., & Seldin, Y. (2022). Split-kl and PAC-Bayes-split-kl Inequalities for Ternary Random Variables. In Advances in Neural Information Processing Systems 35 (NeurIPS 2022) (pp. 11369-11381). NeurIPS Proceedings. https://proceedings.neurips.cc/paper_files/paper/2022/file/49ffa271264808cf500ea528ed8ec9b3-Supplemental-Conference.pdf

Vancouver

Wu Y-S, Seldin Y. Split-kl and PAC-Bayes-split-kl Inequalities for Ternary Random Variables. In Advances in Neural Information Processing Systems 35 (NeurIPS 2022). NeurIPS Proceedings. 2022. p. 11369-11381

Author

Wu, Yi-Shan ; Seldin, Yevgeny. / Split-kl and PAC-Bayes-split-kl Inequalities for Ternary Random Variables. Advances in Neural Information Processing Systems 35 (NeurIPS 2022). NeurIPS Proceedings, 2022. pp. 11369-11381

Bibtex

@inproceedings{e9eaba82a3a2418bb226a5710884b2c7,
title = "Split-kl and PAC-Bayes-split-kl Inequalities for Ternary Random Variables",
abstract = "We present a new concentration of measure inequality for sums of independent bounded random variables, which we name a split-kl inequality. The inequality is particularly well-suited for ternary random variables, which naturally show up in a variety of problems, including analysis of excess losses in classification, analysis of weighted majority votes, and learning with abstention. We demonstrate that for ternary random variables the inequality is simultaneously competitive with the kl inequality, the Empirical Bernstein inequality, and the Unexpected Bernstein inequality, and in certain regimes outperforms all of them. It resolves an open question by Tolstikhin and Seldin [2013] and Mhammedi et al. [2019] on how to match simultaneously the combinatorial power of the kl inequality when the distribution happens to be close to binary and the power of Bernstein inequalities to exploit low variance when the probability mass is concentrated on the middle value. We also derive a PAC-Bayes-split-kl inequality and compare it with the PAC-Bayes-kl, PAC-Bayes-Empirical-Bennett, and PAC-Bayes-Unexpected-Bernstein inequalities in an analysis of excess losses and in an analysis of a weighted majority vote for several UCI datasets. Last but not least, our study provides the first direct comparison of the Empirical Bernstein and Unexpected Bernstein inequalities and their PAC-Bayes extensions.",
author = "Yi-Shan Wu and Yevgeny Seldin",
year = "2022",
language = "English",
pages = "11369--11381",
booktitle = "Advances in Neural Information Processing Systems 35 (NeurIPS 2022)",
publisher = "NeurIPS Proceedings",
note = "36th Conference on Neural Information Processing Systems (NeurIPS 2022) ; Conference date: 28-11-2022 Through 09-12-2022",

}
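
For orientation (not part of the citation record itself): the abstract describes splitting a ternary random variable into binary parts so that the kl inequality can be applied to each. Below is a minimal LaTeX sketch of that idea for Z taking values in {b_0, b_1, b_2} with b_0 < b_1 < b_2, using the standard kl (Langford-Seeger) bound on each binary part at confidence level δ/2. The symbols α_j, p̂_{|j}, and the confidence term ln(2√n/δ)/n are illustrative assumptions, not the paper's exact statement.

\[
  Z = b_0 + \alpha_1 Z_{|1} + \alpha_2 Z_{|2},
  \qquad \alpha_j = b_j - b_{j-1},
  \qquad Z_{|j} = \mathbf{1}\{Z \ge b_j\},
\]
\[
  \mathbb{E}[Z] \;\le\; b_0
  + \alpha_1\, \mathrm{kl}^{-1,+}\!\Big(\hat{p}_{|1},\, \tfrac{1}{n}\ln\tfrac{2\sqrt{n}}{\delta}\Big)
  + \alpha_2\, \mathrm{kl}^{-1,+}\!\Big(\hat{p}_{|2},\, \tfrac{1}{n}\ln\tfrac{2\sqrt{n}}{\delta}\Big)
  \quad \text{with probability at least } 1 - \delta,
\]
where $\hat{p}_{|j} = \tfrac{1}{n}\sum_{i=1}^{n} \mathbf{1}\{Z_i \ge b_j\}$ is the empirical frequency of the $j$-th binary part and $\mathrm{kl}^{-1,+}(\hat{p}, \varepsilon) = \max\{p : \mathrm{kl}(\hat{p}\,\|\,p) \le \varepsilon\}$ is the upper inverse of the binary kl divergence. Each binary bound holds with probability at least $1 - \delta/2$, and a union bound combines them.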

RIS

TY - GEN

T1 - Split-kl and PAC-Bayes-split-kl Inequalities for Ternary Random Variables

AU - Wu, Yi-Shan

AU - Seldin, Yevgeny

PY - 2022

Y1 - 2022

N2 - We present a new concentration of measure inequality for sums of independent bounded random variables, which we name a split-kl inequality. The inequality is particularly well-suited for ternary random variables, which naturally show up in a variety of problems, including analysis of excess losses in classification, analysis of weighted majority votes, and learning with abstention. We demonstrate that for ternary random variables the inequality is simultaneously competitive with the kl inequality, the Empirical Bernstein inequality, and the Unexpected Bernstein inequality, and in certain regimes outperforms all of them. It resolves an open question by Tolstikhin and Seldin [2013] and Mhammedi et al. [2019] on how to match simultaneously the combinatorial power of the kl inequality when the distribution happens to be close to binary and the power of Bernstein inequalities to exploit low variance when the probability mass is concentrated on the middle value. We also derive a PAC-Bayes-split-kl inequality and compare it with the PAC-Bayes-kl, PAC-Bayes-Empirical-Bennett, and PAC-Bayes-Unexpected-Bernstein inequalities in an analysis of excess losses and in an analysis of a weighted majority vote for several UCI datasets. Last but not least, our study provides the first direct comparison of the Empirical Bernstein and Unexpected Bernstein inequalities and their PAC-Bayes extensions.

AB - We present a new concentration of measure inequality for sums of independent bounded random variables, which we name a split-kl inequality. The inequality is particularly well-suited for ternary random variables, which naturally show up in a variety of problems, including analysis of excess losses in classification, analysis of weighted majority votes, and learning with abstention. We demonstrate that for ternary random variables the inequality is simultaneously competitive with the kl inequality, the Empirical Bernstein inequality, and the Unexpected Bernstein inequality, and in certain regimes outperforms all of them. It resolves an open question by Tolstikhin and Seldin [2013] and Mhammedi et al. [2019] on how to match simultaneously the combinatorial power of the kl inequality when the distribution happens to be close to binary and the power of Bernstein inequalities to exploit low variance when the probability mass is concentrated on the middle value. We also derive a PAC-Bayes-split-kl inequality and compare it with the PAC-Bayes-kl, PAC-Bayes-Empirical-Bennett, and PAC-Bayes-Unexpected-Bernstein inequalities in an analysis of excess losses and in an analysis of a weighted majority vote for several UCI datasets. Last but not least, our study provides the first direct comparison of the Empirical Bernstein and Unexpected Bernstein inequalities and their PAC-Bayes extensions.

M3 - Article in proceedings

SP - 11369

EP - 11381

BT - Advances in Neural Information Processing Systems 35 (NeurIPS 2022)

PB - NeurIPS Proceedings

Y2 - 28 November 2022 through 9 December 2022

ER -

ID: 383100686