DeLTA seminar by Yi-Shan Wu
Speaker
Yi-Shan Wu, DIKU
Title
Split-kl and PAC-Bayes-split-kl Inequalities
Abstract
We present a new concentration of measure inequality for sums of independent bounded random variables, which we name the split-kl inequality. The inequality combines the combinatorial power of the kl inequality with the ability to exploit low variance. While for Bernoulli random variables the kl inequality is tighter than the Empirical Bernstein inequality, for random variables taking values in a bounded interval and having low variance the Empirical Bernstein inequality is tighter than the kl. The proposed split-kl inequality yields the best of both worlds. We discuss an application of the split-kl inequality to bounding excess losses. We also derive a PAC-Bayes-split-kl inequality and use a synthetic example and several UCI datasets to compare it with the PAC-Bayes-kl, PAC-Bayes Empirical Bernstein, PAC-Bayes Unexpected Bernstein, and PAC-Bayes Empirical Bennett inequalities.
Based on joint work with Yevgeny Seldin.
Paper link: https://arxiv.org/abs/2206.00706 (to appear in NeurIPS-2022)
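For readers unfamiliar with the kl inequality mentioned in the abstract: it upper-bounds the true Bernoulli mean by numerically inverting the binary KL divergence at the empirical mean. Below is a minimal Python sketch of that kl-inverse computation. It uses a Maurer-style relaxation, kl(p̂ ‖ p) ≤ ln(2√n/δ)/n, as the confidence budget; this illustrates the plain kl bound only, not the paper's split-kl inequality, and the function names are ours, not from the paper.

```python
import math

def kl(p, q):
    """Binary KL divergence kl(p || q) between Bernoulli means, clipped for stability."""
    eps = 1e-12
    p = min(max(p, eps), 1 - eps)
    q = min(max(q, eps), 1 - eps)
    return p * math.log(p / q) + (1 - p) * math.log((1 - p) / (1 - q))

def kl_inv_upper(p_hat, budget, tol=1e-9):
    """Largest q >= p_hat satisfying kl(p_hat || q) <= budget, found by bisection."""
    lo, hi = p_hat, 1.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if kl(p_hat, mid) <= budget:
            lo = mid  # mid is still feasible; move the lower end up
        else:
            hi = mid  # mid overshoots the budget; move the upper end down
    return lo

# Illustrative numbers (our choice): empirical mean 0.1 over n = 1000 samples.
n, delta, p_hat = 1000, 0.05, 0.1
budget = math.log(2 * math.sqrt(n) / delta) / n  # Maurer-style relaxation
print(kl_inv_upper(p_hat, budget))  # high-confidence upper bound on the true mean
```

With probability at least 1 - δ, the printed value upper-bounds the true mean; the split-kl inequality in the paper refines this idea to also exploit low variance for non-Bernoulli bounded variables.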
-------------------
You can subscribe to the DeLTA Seminar mailing list by sending an empty email to delta-seminar-join@list.ku.dk.
Online calendar: https://calendar.google.com/calendar/embed?src=c_bm6u2c38ec3ti4lbfjd13c2aqg%40group.calendar.google.com&ctz=Europe%2FCopenhagen
DeLTA Lab page: https://sites.google.com/diku.edu/delta
DeLTA is a research group affiliated with the Department of Computer Science at the University of Copenhagen, studying diverse aspects of Machine Learning Theory and its applications, including, but not limited to, Reinforcement Learning, Online Learning and Bandits, and PAC-Bayesian analysis.