DeLTA seminar by Diksha Gupta


Participate on Zoom

Speaker

Diksha Gupta from the INRIA center at Université Côte d'Azur, France

Title

Towards Scalable & Robust Federated Learning

Abstract

Federated learning (FL) enables collaborative model training across distributed data sources while preserving data privacy. Despite its promise, FL systems face critical challenges: the need for communication efficiency and robustness against adversarial threats, such as Byzantine participants that can deliberately disrupt the training process. In this talk, I will explore the intricate trade-offs between robustness and communication efficiency in FL systems. I will discuss innovative algorithms that enhance the system's resilience by employing robust aggregation techniques while maintaining efficient communication. The talk will provide a detailed analysis of how these methods impact model convergence and performance, offering insights into their applicability in real-world scenarios where robustness is paramount.
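To make the notion of robust aggregation concrete, below is a minimal illustrative sketch, not the specific algorithms presented in the talk: it contrasts plain averaging with coordinate-wise median aggregation, a standard Byzantine-robust rule. The client updates and the corrupted update are hypothetical values chosen only for illustration.

```python
import numpy as np

def mean_aggregate(updates):
    # Plain averaging: a single Byzantine client can shift the result arbitrarily far.
    return np.mean(updates, axis=0)

def median_aggregate(updates):
    # Coordinate-wise median: a standard Byzantine-robust aggregation rule that
    # tolerates a minority of arbitrarily corrupted client updates.
    return np.median(updates, axis=0)

# Hypothetical round: 9 honest clients send similar gradients, 1 Byzantine client sends garbage.
rng = np.random.default_rng(0)
honest = rng.normal(loc=1.0, scale=0.1, size=(9, 4))
byzantine = np.full((1, 4), 1e6)  # arbitrarily corrupted update
updates = np.vstack([honest, byzantine])

print("mean  :", mean_aggregate(updates))    # dominated by the corrupted update
print("median:", median_aggregate(updates))  # stays close to the honest gradients
```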

Bio

Diksha is a postdoctoral researcher at the INRIA center at Université Côte d'Azur. Before joining INRIA, she was a research scientist at IBM Research, Singapore Lab. Her current work focuses on communication efficiency in federated learning. Previously, she worked on compression of transformer-based computer vision models and anomaly detection in acoustic data. Before IBM, she was a research fellow at the National University of Singapore. She obtained her Ph.D. with distinction from the University of New Mexico, USA.

If you would like to meet Diksha, please contact Nirupam Gupta at nigu@di.ku.dk

Join the DeLTA community

You can subscribe to the DeLTA Seminar mailing list by sending an empty email to delta-seminar-join@list.ku.dk
DeLTA online calendar
DeLTA Lab page