July 16 - Talk & Poster Session

Written on 09.07.2024 22:31 by Sebastian Stich

Dear students,

On July 16, we will again meet in room 0.05 and have a guest talk on personalization in federated learning from 16:15 to 17:15, followed by the poster presentations from 17:15 to 18:00.

Please send your poster by Monday morning (8am) to Yuan Gao (...) if you want us to print it for you. Alternatively, you can print it yourself (see lecture slides).

Feel free to share the invitation for the talk and the poster session with your friends and colleagues.

-------------------
Time and location: July 16, 16:15-17:15, CISPA C0 room 0.05

Title: Personalization Mitigates the Perils of Local SGD for Heterogeneous Distributed Learning

Abstract: Local SGD, or Federated Averaging, is one of the most popular algorithms for large-scale distributed optimization, such as cross-device federated learning. However, it has been challenging to prove its efficacy against simpler algorithms such as mini-batch SGD. In this talk, we will discuss the limitations of the Local SGD algorithm even in simple convex problems and motivate a personalized variant of Local SGD. We will present new convergence guarantees for this personalized approach, highlighting their dependence on existing notions of data heterogeneity, and compare these guarantees to those of Local SGD. Our theoretical analysis reveals that in scenarios with low data heterogeneity, personalized Local SGD outperforms both pure local training on a single machine and Local SGD or mini-batch SGD, which produce a consensus model across all machines. This performance gain arises because the personalized approach avoids the fixed-point discrepancy caused by local updates and can reduce the consensus error between machines to zero, even with a constant learning rate. We support our findings with experiments on distributed linear regression tasks with varying degrees of data heterogeneity.
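
To make the algorithmic contrast concrete, here is a minimal Python sketch (not from the talk) of Local SGD on a synthetic distributed linear regression task, with a simple interpolation-style personalization applied at the end. All hyperparameters here (the mixing weight alpha, the data-generation scheme, the heterogeneity level) are illustrative assumptions, not the construction analyzed in the talk:

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative setup: M machines, each holding its own linear regression task.
    M, n, d = 4, 50, 5        # machines, samples per machine, dimension
    K, T = 10, 100            # local steps per round, communication rounds
    lr = 0.05                 # constant learning rate
    alpha = 0.5               # assumed personalization weight (hypothetical)

    # Heterogeneous data: each machine's optimum drifts slightly from a shared
    # w_star (a low-heterogeneity regime, the setting highlighted in the abstract).
    w_star = rng.normal(size=d)
    tasks = []
    for _ in range(M):
        w_m = w_star + 0.1 * rng.normal(size=d)
        X = rng.normal(size=(n, d))
        y = X @ w_m + 0.01 * rng.normal(size=n)
        tasks.append((X, y))

    def grad(w, X, y):
        # Full-batch least-squares gradient; stands in for stochastic gradients
        # to keep the sketch deterministic.
        return X.T @ (X @ w - y) / len(y)

    w_global = np.zeros(d)
    w_locals = [np.zeros(d) for _ in range(M)]

    for _ in range(T):                        # communication rounds
        for m, (X, y) in enumerate(tasks):
            w = w_global.copy()
            for _ in range(K):                # K local gradient steps per round
                w -= lr * grad(w, X, y)
            w_locals[m] = w
        w_global = np.mean(w_locals, axis=0)  # Local SGD / FedAvg averaging step

    # Personalized models: interpolate the consensus model with each machine's
    # last local iterate instead of forcing a single shared model.
    w_pers = [alpha * w_global + (1 - alpha) * w_m for w_m in w_locals]

In the low-heterogeneity regime sketched here, each per-machine optimum stays close to the shared one, which is the setting where the abstract predicts the personalized models perform best.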

Bio: Kumar Kshitij Patel is a fifth-year PhD student at the Toyota Technological Institute at Chicago (TTIC), advised by Professors Nati Srebro and Lingxiao Wang. His research centers on optimization algorithms in practically relevant settings such as federated learning, where he aims to understand the effects of limited communication budgets, data heterogeneity, sequential decision-making, and privacy considerations. To systematically understand these settings, Kshitij has characterized the min-max oracle complexity of optimization for several representative problem classes with varying heterogeneity. He has also explored game-theoretic aspects of collaboration protocols, aiming to prevent the formation of inaccessible data silos, to sustain collaborations despite strategic agent behavior, and to understand how heterogeneous distribution shifts can affect deployed machine learning models. Additionally, he is interested in designing better and more practical privacy defenses that go beyond traditional differential privacy techniques for large machine learning models, such as diffusion models, to prevent them from memorizing sensitive data.

During the summer of 2023, Kshitij worked with the privacy-preserving machine learning team at Sony AI alongside Lingjuan Lyu. In the summer of 2020, he interned as an applied scientist with the CodeGuru team at Amazon Web Services. Before joining TTIC, Kshitij obtained his BTech in Computer Science and Engineering from the Indian Institute of Technology, Kanpur, where he worked with Professor Purushottam Kar on bandit learning algorithms. He also spent a year on academic exchange at École Polytechnique Fédérale de Lausanne (EPFL), working in the Machine Learning and Optimization Laboratory (MLO) with Professor Martin Jaggi.
