Registration for this course is open until Monday, 18.08.2025 23:59.

News

Information and Survey

Written on 10.07.25 (last change on 10.07.25) by Sebastian Stich


Dear students,

We are happy to announce that "Lectures on Modern Optimization Methods" will be offered as a block course this summer.


📄 Course Information

The course webpage now provides a detailed overview of the course format, expected workload, and covered topics.


📝 Please fill out our short survey

To help us plan the course and finalize the exam format (oral vs. written), we kindly ask you to complete this short survey:

🔗 Course & Exam Participation Survey
Deadline: End of next week (July 18)
(Note: This is not a binding registration, but accurate numbers are very helpful.)


💬 Zoom Info Session

We will also hold a short information session to answer any questions:

📅 Tuesday, July 15
🕔 17:00–17:30 (5:00–5:30 PM)
🔗 Zoom link is posted on the course website

Lectures on Modern Optimization Methods

In this course, we will explore a range of advanced optimization techniques, focusing on essential tools and concepts for today's data science and machine learning tasks. Our goal is to give students a clear overview and understanding of these techniques, so they can apply them in designing algorithms and conducting their own research projects.

We will cover these techniques with a strong emphasis on the mathematical details, helping students understand the key parts of the proofs involved. This understanding is crucial for using these tools effectively in complex situations.

For example, in many learning algorithms, choosing hyperparameters such as step sizes is a crucial task. There are principled approaches to optimal step-size tuning for convex problems, which have inspired the design of adaptive learning-rate methods. These methods are now widely used in training deep learning models.
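As a small illustration of this idea (a hypothetical sketch, not part of the course material), the following Python snippet contrasts gradient descent with the classical step size 1/L for an L-smooth convex quadratic against an AdaGrad-style adaptive step that does not require knowledge of L:

```python
import numpy as np

# Toy problem: minimize f(x) = 0.5 * x^T A x - b^T x (smooth, convex).
A = np.array([[3.0, 0.5], [0.5, 1.0]])   # symmetric positive definite
b = np.array([1.0, -2.0])

def grad(x):
    return A @ x - b

L = np.linalg.eigvalsh(A).max()          # smoothness constant (largest eigenvalue)

# (1) Gradient descent with the classical step size 1/L.
x = np.zeros(2)
for _ in range(200):
    x = x - (1.0 / L) * grad(x)

# (2) AdaGrad-style adaptive step: the step is scaled per coordinate
#     by the accumulated squared gradients, so L is not needed.
y = np.zeros(2)
accum = np.zeros(2)
eta, eps = 1.0, 1e-8
for _ in range(200):
    g = grad(y)
    accum += g ** 2
    y = y - eta * g / (np.sqrt(accum) + eps)

x_star = np.linalg.solve(A, b)           # exact minimizer for comparison
print(np.linalg.norm(x - x_star), np.linalg.norm(y - x_star))
```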

Within the scope of the course project and the exercises, we will also look at how these techniques extend to non-convex optimization and how they connect to deep learning optimization.

Lecturers: Dr. Anton Rodomanov, Dr. Sebastian Stich


More details on the course content will be revealed in due time. Interested students are encouraged to register to receive updates by email.

Please note that the course will be conducted in a block format during the summer break, providing an intensive learning experience. Mornings will be dedicated to lectures, where you will explore theoretical insights and innovative strategies. Afternoons will focus on practical exercises and project work, allowing you to apply what you have learned and deepen your understanding through hands-on experience.

This course is designed to complement the existing curriculum at UdS, making it an ideal follow-up for students who have already participated in other optimization courses (such as Continuous Optimization, Convex Analysis and Optimization, Optimization for Machine Learning, Non-smooth Analysis and Optimization in Data Science, etc.). While the course is self-contained, familiarity with key concepts such as convexity and gradient descent is beneficial, as catching up on fundamental concepts during the intensive course will be challenging. To support all participants, we will offer a kick-off session before the block course, providing a detailed review of background material and an opportunity to refresh or catch up on essential topics.


Important Dates

Kick-off: July 15, 5pm

Block Course: August 21 – September 5, 2025

Exam date: TBD

The lectures will be complemented by tutorial sessions, which will include exercises and project work. While the exercises will not be graded, there will be certain mandatory components required for exam admission, such as submitting the course project. Further details on these requirements will be clarified later.
