Selected talks
- A Stochastic Proximal Point Algorithm for Total Variation Regularization over Large Scale Graphs, Conference on Decision and Control, December 2016, Las Vegas, USA
- Stochastic Proximal Gradient algorithm, France/Japan Machine Learning Workshop, September 2017, École Normale Supérieure, Paris, France
- Distributed Douglas-Rachford algorithm, ANR ODISSEE meeting, November 2017, Nice Sophia Antipolis University, France
- Snake, CAp 2017, June 2017, IMAG Grenoble, France
- Snake: a Stochastic Proximal Gradient Algorithm for Regularized Problems over Large Graphs, GdR ISIS meeting, February 2018, Télécom Paris, France
- A Stochastic Forward-Backward algorithm with application to regularization over large graphs, Machine Learning Optimization seminar, March 2018, École Polytechnique Fédérale de Lausanne, Switzerland
- A Splitting Algorithm for Minimization under Stochastic Linear Constraints, ISMP, July 2018, Bordeaux, France
- PhD thesis defense, November 2018, Télécom Paris, France
- Sampling as Convex Optimization, Guest Lecture on Optimization for Machine Learning, March 2019, KAUST, KSA
- Stochastic Chambolle-Pock, Visual Computing Center showcase, April 2019, KAUST, KSA
- Exponential Convergence Time of Gradient Descent for One-Dimensional Deep Linear Neural Networks, Mathematics of Deep Learning seminar, May 2019, KAUST, KSA
- On Stochastic Primal–Dual Algorithms, ICCOPT, August 2019, TU Berlin, Germany
- Langevin as an Optimization Algorithm, Computer Science Graduate Seminar, November 2019, KAUST, KSA
- Stochastic Proximal Langevin Algorithm, NeurIPS, December 2019, Vancouver, Canada
- Primal–Dual Interpretation of the Proximal Stochastic Gradient Langevin Algorithm, Second Symposium on Machine Learning and Dynamical Systems, Fields Institute, Toronto, Canada (online)