
Séminaire Images Optimisation et Probabilités

(Maths-ia) Stochastic implicit (proximal) methods and variance reduction

Cheik Traoré

( TSE )

Conference room

January 8, 2026, at 11:15

Stochastic algorithms, particularly stochastic gradient descent (SGD), have become the methods of choice in data science and machine learning, as they scale well to large problems. However, the variance of the stochastic gradient estimates degrades SGD's convergence properties. This issue has been addressed by variance reduction techniques such as SVRG and SAGA. More recently, the stochastic proximal point algorithm (SPPA) has emerged as an alternative and was shown to be more robust than SGD with respect to the choice of step size. In this talk, we will examine the SPPA algorithm. Specifically, we will show how variance reduction techniques can improve the convergence rates of stochastic proximal point methods, as has already been established for SGD.
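To illustrate the robustness claim in the abstract, here is a minimal sketch (not the speaker's actual setting) comparing SGD with a stochastic proximal point step on a least-squares problem with component functions f_i(x) = ½(aᵢᵀx − bᵢ)², for which the proximal step has a closed form. The problem data, step size, and iteration counts are illustrative assumptions.

```python
import numpy as np

# Synthetic consistent least-squares problem: b = A @ x_star,
# with f_i(x) = 0.5 * (a_i @ x - b_i)**2 (illustrative data).
rng = np.random.default_rng(0)
n, d = 50, 5
A = rng.standard_normal((n, d))
x_star = rng.standard_normal(d)
b = A @ x_star

def sgd(step, iters=2000):
    """Plain SGD: x <- x - step * grad f_i(x) for a random index i."""
    x = np.zeros(d)
    for _ in range(iters):
        i = rng.integers(n)
        a = A[i]
        x = x - step * (a @ x - b[i]) * a
    return x

def sppa(step, iters=2000):
    """Stochastic proximal point: x <- argmin_y f_i(y) + ||y - x||^2 / (2*step).
    For f_i(y) = 0.5*(a @ y - b_i)^2 the minimizer is available in closed form."""
    x = np.zeros(d)
    for _ in range(iters):
        i = rng.integers(n)
        a = A[i]
        r = (a @ x - b[i]) / (1.0 + step * (a @ a))  # damped residual
        x = x - step * r * a
    return x

# A step size this large makes SGD unstable, while the implicit (proximal)
# update damps it automatically -- the robustness SPPA is known for.
big_step = 5.0
err_sppa = np.linalg.norm(sppa(big_step) - x_star)
err_sgd = np.linalg.norm(sgd(big_step) - x_star)
print(f"SPPA error: {err_sppa:.2e}, SGD error: {err_sgd:.2e}")
```

With this step size, the explicit SGD update multiplies the error along each sampled direction by a factor larger than one in magnitude and blows up, while the implicit update's effective step is capped at 1/‖aᵢ‖², so SPPA still converges.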