Journal papers
- Adil Salim, Laurent Condat, Konstantin Mishchenko, and Peter Richtárik, “Dualize, Split, Randomize: Fast Nonsmooth Optimization Algorithms”, to appear in Journal of Optimization Theory and Applications, April 2020.
- Adil Salim, “A Strong Law of Large Numbers for Random Monotone Operators”, October 2019.
- Pascal Bianchi, Walid Hachem, and Adil Salim, “A Fully Stochastic Primal-Dual Algorithm”, Optimization Letters, June 2020.
- Adil Salim, Pascal Bianchi, and Walid Hachem, “Snake: a Stochastic Proximal Gradient Algorithm for Regularized Problems over Large Graphs”, IEEE Transactions on Automatic Control, March 2018. (Here are the Code and a short version presented at the NIPS 2017 Black in AI Workshop.)
- Pascal Bianchi, Walid Hachem, and Adil Salim, “A constant step Forward-Backward algorithm involving random maximal monotone operators”, Journal of Convex Analysis, March 2018. (Here is an extended abstract.)
- Pascal Bianchi, Walid Hachem, and Adil Salim, “Constant Step Stochastic Approximations Involving Differential Inclusions: Stability, Long-Run Convergence and Applications”, Stochastics, May 2018.
Conference papers
- Yongxin Chen, Sinho Chewi, Adil Salim, and Andre Wibisono, “Improved analysis for a proximal algorithm for sampling”, COLT 2022.
- Krishnakumar Balasubramanian, Sinho Chewi, Murat A. Erdogdu, Adil Salim, and Matthew Zhang, “Towards a Theory of Non-Log-Concave Sampling: First-Order Stationarity Guarantees for Langevin Monte Carlo”, COLT 2022.
- Adil Salim, Lukang Sun, and Peter Richtárik, “Complexity Analysis of Stein Variational Gradient Descent Under Talagrand’s Inequality T1”, ICML 2022.
- Adil Salim, Laurent Condat, Dmitry Kovalev, and Peter Richtárik, “An Optimal Algorithm for Strongly Convex Minimization under Affine Constraints”, AISTATS 2022.
- Dmitry Kovalev, Adil Salim, and Peter Richtárik, “Optimal and Practical Algorithms for Smooth and Strongly Convex Decentralized Optimization”, NeurIPS 2020.
- Anna Korba, Adil Salim, Michael Arbel, Giulia Luise, and Arthur Gretton, “A Non-Asymptotic Analysis for Stein Variational Gradient Descent”, NeurIPS 2020.
- Adil Salim and Peter Richtárik, “Primal Dual Interpretation of the Proximal Stochastic Gradient Langevin Algorithm”, NeurIPS 2020.
- Adil Salim, Anna Korba, and Giulia Luise, “The Wasserstein Proximal Gradient Algorithm”, NeurIPS 2020.
- Sélim Chraibi, Ahmed Khaled, Dmitry Kovalev, Adil Salim, Peter Richtárik, and Martin Takáč, “Distributed Fixed Point Methods with Compressed Iterates”, December 2019.
- Sélim Chraibi, Adil Salim, Samuel Horváth, Filip Hanzely, and Peter Richtárik, “Learning To Optimize Via Dual Space Preconditioning”, technical report, September 2019.
- Adil Salim, Dmitry Kovalev, and Peter Richtárik, “Stochastic Proximal Langevin Algorithm: Potential Splitting and Nonasymptotic Rates”, spotlight at NeurIPS 2019, Vancouver, Canada. (Here are the slides of the spotlight, the Code, and the Poster.)
- Michael Arbel, Anna Korba, Adil Salim, and Arthur Gretton, “Maximum Mean Discrepancy Gradient Flow”, NeurIPS 2019, Vancouver, Canada.
- Anna Korba, Adil Salim, Michael Arbel, and Arthur Gretton, “Yet another look at Stein Variational Gradient Descent”, ICML 2019 Workshop on Stein’s Method, Long Beach, USA.
- Adil Salim and Walid Hachem, “On the Performance of the Stochastic FISTA”, March 2019.
- Sholom Schechtman, Adil Salim, and Pascal Bianchi, “Passty Langevin”, CAp 2019, Toulouse, France.
- Adil Salim, Pascal Bianchi, and Walid Hachem, “A Constant Step Stochastic Douglas-Rachford Algorithm with Application to Non Separable Regularization”, IEEE ICASSP 2018, Calgary, Canada. (Here are the Technical Report, an extended abstract of the Technical Report, and the Code.)
- Adil Salim, Pascal Bianchi, and Walid Hachem, “Snake: a Stochastic Proximal Gradient Algorithm for Regularized Problems over Large Graphs”, CAp 2017, Grenoble, France.
- Pascal Bianchi, Walid Hachem, and Adil Salim, “Convergence d’un algorithme du gradient proximal stochastique à pas constant et généralisation aux opérateurs monotones aléatoires” [Convergence of a constant-step stochastic proximal gradient algorithm and generalization to random monotone operators], GRETSI 2017, Juan-les-Pins, France.
- Adil Salim, Pascal Bianchi, and Walid Hachem, “A Stochastic Proximal Point Algorithm for Total Variation Regularization over Large Scale Graphs”, IEEE CDC 2016, Las Vegas, USA.
- Rahul Mourya, Pascal Bianchi, Adil Salim, and Cédric Richard, “An Adaptive Distributed Asynchronous Algorithm with Application to Target Localization”, IEEE CAMSAP 2017, Curaçao, Dutch Antilles.
- Pascal Bianchi, Walid Hachem, and Adil Salim, “Building Stochastic Optimization Algorithms with Random Monotone Operators”, EUCCO 2016, Leuven, Belgium.
Reports
- PhD thesis, “Random monotone operators and application to stochastic optimization”, under the supervision of Pascal Bianchi and Walid Hachem.
- Master’s thesis, “Processus à accroissements libres” [Processes with free increments], under the supervision of Philippe Biane.
- Internship report, “Méthodes séquentielles de traitement de données” [Sequential methods for data processing], under the supervision of Florian Maire.
Service
I have served as an area chair for:
- Black in AI Workshop, 2019 and 2020
I have reviewed papers for:
- Journal of Machine Learning Research
- Journal of the Royal Statistical Society: Series B
- NeurIPS 2019, 2020
- ICML 2020
- ICLR 2021
- Set-Valued and Variational Analysis
- Applied Mathematics and Optimization
- IEEE Transactions on Information Theory
- IEEE Transactions on Signal and Information Processing over Networks
- IEEE Transactions on Signal Processing
- IEEE Signal Processing Letters
- Automatica
- Numerical Algorithms
- Journal of Mathematical Analysis and Applications
- Journal of Scientific Computing
Awards and Distinctions
- 2020 Top 33% ICML reviewer
- 2019 Top 50% NeurIPS reviewer
- 2019 Participation in MLSS 2019, Skoltech, Moscow, Russia
- 2019 Participation in DS3 2019, École Polytechnique, France
- 2018 GDR ISIS travel grant for a PhD research visit at EPFL
- 2017, 2018 NIPS Black in AI Workshop travel grants
- 2015 Top 3 Master’s thesis out of nearly 200 students
- 2009–2012 Merit Scholarship