I am a Senior Researcher in the Machine Learning Foundations group at Microsoft Research (Redmond, USA). Previously, I was a Google Research Fellow at the Simons Institute, UC Berkeley, USA.
I did my Ph.D. at Telecom Paris and Paris–Saclay University, France, under the supervision of Pascal Bianchi and Walid Hachem. I then did a postdoc at KAUST, Saudi Arabia, hosted by Peter Richtárik.
I use optimization, statistics, optimal transport, convex analysis, and related tools to study machine learning algorithms. I received Master's degrees in 2015 from ENSAE Paris, where I studied statistics, and from Paris–Saclay University, where I studied probability theory. Here is my CV.
News
-
Anna Korba and I presented a tutorial on Sampling as Optimization at ICML 2022. Here are our slides, and you can watch the video here.
-
Sinho Chewi is writing a book on Sampling.
Some recent papers
Sampling and optimal transport
-
Sitan Chen, Sinho Chewi, Jerry Li, Yuanzhi Li, Adil Salim and Anru R. Zhang, “Sampling is as easy as learning the score: theory for diffusion models with minimal data assumptions”, Notable top 5% paper @ ICLR 2023.
-
Yongxin Chen, Sinho Chewi, Adil Salim and Andre Wibisono, “Improved analysis for a proximal algorithm for sampling”, COLT 2022.
-
Adil Salim, Lukang Sun and Peter Richtárik, “A Convergence Theory for SVGD in the Population Limit under Talagrand’s Inequality T1”, ICML 2022.
Optimization and monotone operators
-
Michael Diao, Krishnakumar Balasubramanian, Sinho Chewi and Adil Salim, “Forward-Backward Gaussian Variational Inference via JKO in the Bures–Wasserstein Space”, ICML 2023.
-
Sinho Chewi, Sébastien Bubeck and Adil Salim, “On the complexity of finding stationary points of smooth functions in one dimension”, Best student paper award @ ALT 2023.
-
Adil Salim, Laurent Condat, Dmitry Kovalev and Peter Richtárik, “An Optimal Algorithm for Strongly Convex Minimization under Affine Constraints”, AISTATS 2022.