News
Exam September 3, GHH, 14.00-16.30
Written on 02.09.24 (last change on 02.09.24) by Sebastian Stich

Dear Students,

The exam tomorrow will take place in the building E2 2, room GHH (the big lecture hall). Time: 14:00 - 16:30h.
Q&A Session (Tuesday, Aug 27, 2pm)
Written on 22.08.24 by Sebastian Stich

Dear students,

To answer any questions you might have about the lecture materials, we are offering a Q&A session next Tuesday at 14.00h. The session will take place in the CISPA building, in either room 0.01 or 0.05. You can also use the forum to post questions and to exchange with other students.

As a reminder, the final exam is on September 3. You must register (on LSF) for the exam at least a week before.
July 16 - Talk & Poster Session
Written on 09.07.24 (last change on 09.07.24) by Sebastian Stich

Dear students,

On July 16, we will again meet in room 0.05 and have a guest talk on personalization in federated learning from 16:15 to 17:15, followed by the poster presentations from 17:15 to 18:00. Please send your poster by Monday morning (8am) to Yuan Gao (...) if you want us to print it for you. Alternatively, you can print it yourself (see lecture slides). Feel free to share the invitation for the talk and the poster session with your friends and colleagues.

-------------------

Title: Personalization Mitigates the Perils of Local SGD for Heterogeneous Distributed Learning

Abstract: Local SGD, or Federated Averaging, is one of the most popular algorithms for large-scale distributed optimization, such as cross-device federated learning. However, it has been challenging to prove its efficacy against simpler algorithms such as mini-batch SGD. In this talk, we will discuss the limitations of the Local SGD algorithm even in simple convex problems and motivate a personalized variant of Local SGD. We will discuss new convergence guarantees for this personalized approach, highlighting their dependence on existing notions of data heterogeneity, and compare these guarantees to those of Local SGD. Our theoretical analysis reveals that in scenarios with low data heterogeneity, personalized Local SGD outperforms both pure local training on a single machine and Local SGD/mini-batch SGD, which produce a consensus model across all machines. This performance gain arises because the personalized approach avoids the fixed-point discrepancy caused by its local updates and can reduce the consensus error between machines to zero, even with a constant learning rate. We support our findings with experiments on distributed linear regression tasks with varying degrees of data heterogeneity.

Bio: Kumar Kshitij Patel is a fifth-year PhD student at the Toyota Technological Institute at Chicago (TTIC), advised by Professors Nati Srebro and Lingxiao Wang. His research centers on optimization algorithms in practically relevant settings such as federated learning. He wants to understand the effects of limited communication budgets, data heterogeneity, sequential decision-making, and privacy considerations in such settings. To systematically understand these settings, Kshitij has characterized the min-max oracle complexity of optimization for several representative problem classes with varying heterogeneity. He has also explored the game-theoretic aspects of collaboration protocols, aiming to prevent the formation of inaccessible data silos, to sustain collaboration despite strategic agent behavior, and to understand how heterogeneous distribution shifts can affect deployed machine learning models. Additionally, he is interested in designing better and more practical privacy defenses that go beyond traditional differential privacy techniques for large machine learning models, such as diffusion models, to prevent them from memorizing sensitive data. During the summer of 2023, Kshitij worked with the privacy-preserving machine learning team at Sony AI alongside Lingjuan Lyu. In the summer of 2020, he interned as an applied scientist with the CodeGuru team at Amazon Web Services. Before joining TTIC, Kshitij obtained his BTech in Computer Science and Engineering from the Indian Institute of Technology, Kanpur, where he worked with Professor Purushottam Kar on bandit learning algorithms. He also spent a year on academic exchange at École Polytechnique Fédérale de Lausanne (EPFL), working in the Machine Learning and Optimization Laboratory (MLO) with Professor Martin Jaggi.
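The abstract above contrasts consensus-based Local SGD (Federated Averaging) with a personalized variant on heterogeneous distributed linear regression. As a rough illustration only (not the speaker's actual algorithm or experimental setup), here is a minimal numpy sketch: it uses full-batch local gradient steps for simplicity, and the mixing weight `alpha`, the step size, and all problem constants are assumptions made for the demo.

```python
# Illustrative sketch: Local SGD vs. a simple personalized variant on
# heterogeneous distributed linear regression. All constants are assumed.
import numpy as np

rng = np.random.default_rng(0)
M, n, d = 4, 200, 5          # machines, samples per machine, dimension
K, R, lr = 10, 50, 0.05      # local steps, rounds, step size
alpha = 0.5                  # personalization mixing weight (assumed)

# Heterogeneous data: each machine m has its own ground-truth vector.
w_true = [rng.normal(size=d) for _ in range(M)]
X = [rng.normal(size=(n, d)) for _ in range(M)]
y = [X[m] @ w_true[m] + 0.1 * rng.normal(size=n) for m in range(M)]

def grad(w, m):
    """Full-batch gradient of the least-squares loss on machine m."""
    return X[m].T @ (X[m] @ w - y[m]) / n

# Local SGD / FedAvg: K local steps per round, then average to one
# consensus model shared by all machines.
w = np.zeros(d)
for _ in range(R):
    local_models = []
    for m in range(M):
        wm = w.copy()
        for _ in range(K):
            wm -= lr * grad(wm, m)
        local_models.append(wm)
    w = np.mean(local_models, axis=0)

# Personalized variant (illustrative): each machine keeps its own model
# and only partially mixes it with the average before local updates.
pers = [np.zeros(d) for _ in range(M)]
for _ in range(R):
    avg = np.mean(pers, axis=0)
    pers = [(1 - alpha) * pm + alpha * avg for pm in pers]
    for m in range(M):
        for _ in range(K):
            pers[m] -= lr * grad(pers[m], m)

mse_consensus = np.mean([np.mean((X[m] @ w - y[m]) ** 2) for m in range(M)])
mse_personal = np.mean([np.mean((X[m] @ pers[m] - y[m]) ** 2) for m in range(M)])
print(f"consensus MSE:    {mse_consensus:.4f}")
print(f"personalized MSE: {mse_personal:.4f}")
```

With strongly heterogeneous local targets as above, the single consensus model necessarily compromises between machines, while each personalized model stays close to its own local optimum, so its average training error is lower; shrinking the heterogeneity of `w_true` narrows this gap, mirroring the trade-off the abstract describes.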
Tutorial this week
Written on 08.07.24 by Yuan Gao

Hi all,

Please note that this week we will resume the tutorial and discuss exercise sheet 10 (the final exercise sheet) at our usual times.

Best,
Yuan
Lecture updates
Written on 10.06.24 by Sebastian Stich

Dear Students,

Sebastian
Midterm Exam
Written on 07.05.24 by Sebastian Stich

Dear students,

The midterm exam takes place on June 4 in the lecture slot (start: 16:15h, rooms 0.01 - 0.05 at CISPA). Please register on CMS if you want to take the exam. (Note: a registration on LSF is not needed for midterms.)
Additional Tutorial Session
Written on 29.04.24 by Sebastian Stich

In addition to the session on Tuesday afternoons, we will now offer an additional exercise session on Mondays from 13:15 to 14:00 on Zoom (starting next week). This session is primarily intended for students who cannot attend the Tuesday session, but everyone is welcome to join. The main goal of these exercise sessions is to discuss exercises, material from the lecture, or any other questions you may have. If there are no questions, the session may end early, so we encourage everyone to join on time.
First lecture today
Written on 16.04.24 (last change on 16.04.24) by Sebastian Stich

Welcome to the OPTML course 2024! The first lecture will be today at 4.15pm in the CISPA building, room 0.01. The lecture can also be followed on Zoom (link available to registered students).