I am a Senior Researcher in the Machine Learning Foundations group at Microsoft Research (Redmond, USA). Previously, I was a Google Research Fellow at the Simons Institute, UC Berkeley, USA.
I received my Ph.D. from Telecom Paris and Paris–Saclay University, France, under the supervision of Pascal Bianchi and Walid Hachem. I then did a postdoc at KAUST, Saudi Arabia, hosted by Peter Richtárik.
I use tools from optimization, statistics, optimal transport, and convex analysis to study machine learning algorithms. In 2015, I received Master's degrees from ENSAE Paris, where I studied statistics, and from Paris–Saclay University, where I studied probability theory. Here is my CV.
News
- Sinho Chewi is spending the summer with us at MSR.
- Anna Korba and I will give a tutorial on Sampling as Optimization at ICML 2022.
Selected papers
Sampling and optimal transport
- Yongxin Chen, Sinho Chewi, Adil Salim, Andre Wibisono, “Improved analysis for a proximal algorithm for sampling”, February 2022.
- Adil Salim, Lukang Sun, Peter Richtárik, “Complexity Analysis of Stein Variational Gradient Descent Under Talagrand’s Inequality T1”, June 2021.
- Anna Korba, Adil Salim, Michael Arbel, Giulia Luise and Arthur Gretton, “A Non-Asymptotic Analysis for Stein Variational Gradient Descent”, NeurIPS 2020.
- Adil Salim and Peter Richtárik, “Primal Dual Interpretation of the Proximal Stochastic Gradient Langevin Algorithm”, NeurIPS 2020.
Optimization and monotone operators
- Adil Salim, Laurent Condat, Dmitry Kovalev and Peter Richtárik, “An Optimal Algorithm for Strongly Convex Minimization under Affine Constraints”, AISTATS 2022.
- Dmitry Kovalev, Adil Salim and Peter Richtárik, “Optimal and Practical Algorithms for Smooth and Strongly Convex Decentralized Optimization”, NeurIPS 2020.