News
Currently, no news is available.
Lectures on Modern Optimization Methods
In this course, we will explore a range of advanced optimization techniques, focusing on tools and concepts essential for today's data science and machine learning tasks. Our goal is to give students a clear overview and understanding of these techniques, so that they can apply them when designing algorithms and conducting their own research projects.
We will cover these techniques with a strong emphasis on the mathematical details, helping students understand the key parts of the proofs involved. This understanding is crucial for using these tools effectively in complex situations.
For example, in many learning algorithms, choosing hyperparameters such as step sizes is a central task. There are principled approaches to optimal step size tuning in convex problems, which have inspired the development of adaptive learning rate methods. These methods are now widely used in training deep learning models.
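To give a flavor of this connection, here is a minimal sketch (our own illustration, not part of the course material) contrasting plain gradient descent with the classical step size 1/L for an L-smooth convex quadratic against an AdaGrad-style adaptive scheme that needs no knowledge of L; all names and parameter values below are hypothetical choices for the example.

import numpy as np

# Toy L-smooth convex objective: f(x) = 0.5 * x^T A x with A positive definite.
# The smoothness constant L is the largest eigenvalue of A.
A = np.diag([10.0, 1.0])
L = np.max(np.linalg.eigvalsh(A))

def grad(x):
    return A @ x

def gradient_descent(x0, steps=100):
    # Classical choice for an L-smooth convex function: constant step size 1/L.
    x = x0.copy()
    for _ in range(steps):
        x -= (1.0 / L) * grad(x)
    return x

def adagrad(x0, steps=100, eta=1.0, eps=1e-8):
    # AdaGrad-style adaptive step sizes: per-coordinate learning rates shrink
    # with the accumulated squared gradients, so no knowledge of L is needed.
    x = x0.copy()
    g_sq = np.zeros_like(x)
    for _ in range(steps):
        g = grad(x)
        g_sq += g ** 2
        x -= eta * g / (np.sqrt(g_sq) + eps)
    return x

x0 = np.array([3.0, -2.0])
print("GD:     ", gradient_descent(x0))   # both approach the minimizer at 0
print("AdaGrad:", adagrad(x0))

The point of the comparison is that the adaptive variant replaces the globally tuned constant 1/L with per-coordinate step sizes built from observed gradients; the course develops the theory behind when and why such schemes work.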
Within the scope of the course project and the exercises, we will also look at how these techniques extend to non-convex optimization and at their connections to deep learning optimization.
Lecturers: Dr. Anton Rodomanov, Dr. Sebastian Stich
More details on the course content will be revealed in due time. Interested students are encouraged to register to receive updates by email.
Please note that the course will be conducted in a block format during the summer break, providing an intensive learning experience. Mornings will be dedicated to lectures, where you will explore theoretical insights and innovative strategies. Afternoons will focus on practical exercises and project work, allowing you to apply what you have learned and deepen your understanding through hands-on experience.
This course is designed to complement the existing curriculum at UdS, making it an ideal follow-up for students who have already participated in other optimization courses (such as Continuous Optimization, Convex Analysis and Optimization, Optimization for Machine Learning, Non-smooth Analysis and Optimization in Data Science, etc.). While the course is self-contained, familiarity with key concepts such as convexity and gradient descent is beneficial, as catching up on fundamental concepts during the intensive course will be challenging. To support all participants, we will offer a kick-off session before the block course, providing a detailed review of background material and an opportunity to refresh or catch up on essential topics.
Important Dates
Kick-off: TBD
Block Course: 25 Aug 2025 – 5 Sept 2025
Exam date: TBD
The lectures will be complemented by tutorial sessions, which will include exercises and project work. While the exercises will not be graded, certain components will be mandatory for exam admission, such as submitting the course project. Further details on these requirements will be provided later.