DeLTA Seminar by Eduard Gorbunov

Speaker
Eduard Gorbunov, MBZUAI
Title
Communication-Efficient and Byzantine-Robust Distributed Learning
Abstract
Distributed learning has emerged as a leading paradigm for training large machine learning models. While it offers significant advantages, such as accelerated training and the ability to leverage diverse datasets, it also introduces unique challenges, particularly in communication efficiency and robustness to Byzantine attacks. In this talk, I will focus on both of these challenges. Specifically, I will discuss standard approaches to achieving communication efficiency and Byzantine robustness, highlight their limitations, and present two algorithmic solutions, DIANA and Byz-VR-MARINA, that address these shortcomings. I will also provide theoretical convergence guarantees for these methods and explore how their underlying principles can be extended further.
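
For readers unfamiliar with DIANA, the following is a minimal Python sketch of a DIANA-style step, assuming the published form of the method (workers compress gradient differences relative to local shifts). The rand-k compressor, the toy quadratic problem, and the step sizes gamma and alpha are illustrative assumptions, not details from the talk.

    import numpy as np

    def rand_k(v, k, rng):
        """Unbiased rand-k sparsifier: keep k random coordinates, scale by d/k."""
        d = v.size
        out = np.zeros_like(v)
        idx = rng.choice(d, size=k, replace=False)
        out[idx] = v[idx] * (d / k)
        return out

    # Toy problem: each of n workers holds a quadratic f_i(x) = 0.5*||A_i x - b_i||^2
    rng = np.random.default_rng(0)
    n, d, k = 5, 20, 4
    A = [rng.standard_normal((30, d)) for _ in range(n)]
    b = [rng.standard_normal(30) for _ in range(n)]
    grad = lambda i, x: A[i].T @ (A[i] @ x - b[i])

    x = np.zeros(d)
    h = [np.zeros(d) for _ in range(n)]  # per-worker shifts
    h_avg = np.zeros(d)                  # server tracks the average shift
    gamma, alpha = 1e-3, k / d           # illustrative step sizes

    for step in range(2000):
        deltas = []
        for i in range(n):
            # Compress the gradient *difference*; E[delta] = grad(i, x) - h[i],
            # so the averaged estimate below is unbiased.
            delta = rand_k(grad(i, x) - h[i], k, rng)
            h[i] += alpha * delta        # move the local shift toward the gradient
            deltas.append(delta)
        delta_avg = sum(deltas) / n
        g_hat = h_avg + delta_avg        # unbiased estimate of the full gradient
        h_avg += alpha * delta_avg       # server mirrors the workers' shift updates
        x -= gamma * g_hat

With alpha = k / d, the shift step size satisfies the usual DIANA condition alpha <= 1/(1 + omega), since the rand-k compressor has variance parameter omega = d/k - 1.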
Join the DeLTA community
You can subscribe to the DeLTA Seminar mailing list by sending an empty email to delta-seminar-join@list.ku.dk
DeLTA online calendar
DeLTA Lab page