Outline
Professor: Alain Trouvé
Teaching Assistants: Xavier Fontaine, Laure Quivy, Miguel Colom, Argyris Kalogeratos
Timetable: Tuesday, 9 a.m. to 1 p.m.
This is an introductory course to the field of optimization. Classic algorithms such as gradient descent, optimal gradient descent, and Newton's method will be presented. Optimization under constraints (equality and inequality constraints) will be discussed. Theoretical results on the optimization of non-smooth functionals will be presented using subgradients and the Fenchel transform. These notions will be illustrated with exercises and practical work, with applications in image processing.
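As a preview of the simplest algorithm listed above, here is a minimal gradient-descent sketch in Python; the objective, step size, and stopping rule are illustrative assumptions, not course material.

    import numpy as np

    def gradient_descent(grad, x0, step=0.1, tol=1e-8, max_iter=1000):
        """Plain gradient descent with a fixed step size (illustrative sketch)."""
        x = np.asarray(x0, dtype=float)
        for _ in range(max_iter):
            g = grad(x)
            if np.linalg.norm(g) < tol:   # stop when the gradient is nearly zero
                break
            x = x - step * g              # move in the direction opposite to the gradient
        return x

    # Example: minimize f(x) = ||A x - b||^2, whose gradient is 2 A^T (A x - b)
    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    b = np.array([1.0, 0.0])
    x_min = gradient_descent(lambda x: 2 * A.T @ (A @ x - b), x0=np.zeros(2), step=0.05)
    print(x_min)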
References
- Convex Optimization (Boyd)
- Introduction à l’Analyse Numérique Matricielle (Ciarlet)
- Numerical Optimization: Theoretical and Practical Aspects (Bonnans et al.)
- Nonlinear Programming (Bertsekas)
- Introductory Lectures on Convex Optimization (Nesterov)
- I’m a bandit (Bubeck)
- Convex Analysis and Monotone Operator Theory in Hilbert Spaces (Bauschke & Combettes)
- Optimization. Application in image processing (Nikolova)
- Proximal Algorithms (Parikh & Boyd)
- Convex Analysis and Minimization Algorithms (Hiriart-Urruty & Lemaréchal)