Journal papers
-
Krishnakumar Balasubramanian, Larry Goldstein, Nathan Ross and Adil Salim, Gaussian random field approximation via Stein’s method with applications to wide random neural networks, Applied and Computational Harmonic Analysis, 2024.
-
Adil Salim, A Strong Law of Large Numbers for Random Monotone Operators, Set-Valued and Variational Analysis, 2023.
-
Adil Salim, Laurent Condat, Konstantin Mishchenko and Peter Richtárik, Dualize, Split, Randomize: Fast Nonsmooth Optimization Algorithms, Journal of Optimization Theory and Applications, April 2020.
-
Pascal Bianchi, Walid Hachem and Adil Salim, A Fully Stochastic Primal-Dual Algorithm, Optimization Letters, June 2020.
-
Adil Salim, Pascal Bianchi, and Walid Hachem, Snake: a Stochastic Proximal Gradient Algorithm for Regularized Problems over Large Graphs, IEEE Transactions on Automatic Control, March 2018. (The Code and a short version presented at the NIPS 2017 Black in AI Workshop are available.)
-
Pascal Bianchi, Walid Hachem, and Adil Salim, A constant step Forward-Backward algorithm involving random maximal monotone operators, Journal of Convex Analysis, March 2018. (An extended abstract is available.)
-
Pascal Bianchi, Walid Hachem, and Adil Salim, Constant Step Stochastic Approximations Involving Differential Inclusions: Stability, Long-Run Convergence and Applications, Stochastics, May 2018.
Conference papers
-
Victor Priser, Pascal Bianchi and Adil Salim, “Long-time asymptotics of noisy SVGD outside the population limit”, June 2024.
-
Marah Abdin et al., “Phi-3 Technical Report”, April 2024.
-
Suriya Gunasekar, Yi Zhang, et al., “Textbooks Are All You Need”, June 2023.
-
Sitan Chen, Sinho Chewi, Holden Lee, Yuanzhi Li, Jianfeng Lu and Adil Salim, “The probability flow ODE is provably fast”, NeurIPS 2023.
-
Michael Diao, Krishnakumar Balasubramanian, Sinho Chewi, Adil Salim, “Forward-Backward Gaussian Variational Inference via JKO in the Bures–Wasserstein Space”, ICML 2023.
-
Sitan Chen, Sinho Chewi, Jerry Li, Yuanzhi Li, Adil Salim and Anru R. Zhang, “Sampling is as easy as learning the score: theory for diffusion models with minimal data assumptions”, Notable top 5% paper @ ICLR 2023, Kigali, Rwanda.
-
Sinho Chewi, Sébastien Bubeck and Adil Salim, “On the complexity of finding stationary points of smooth functions in one dimension”, Best student paper award @ ALT 2023, Singapore.
-
Lukang Sun, Adil Salim and Peter Richtárik, “Federated Sampling with Langevin Algorithm under Isoperimetry”, TMLR 2023.
-
Yongxin Chen, Sinho Chewi, Adil Salim, Andre Wibisono, “Improved analysis for a proximal algorithm for sampling”, COLT 2022, London, UK.
-
Krishnakumar Balasubramanian, Sinho Chewi, Murat A. Erdogdu, Adil Salim, Matthew Zhang, “Towards a Theory of Non-Log-Concave Sampling: First-Order Stationarity Guarantees for Langevin Monte Carlo”, COLT 2022, London, UK.
-
Adil Salim, Lukang Sun, Peter Richtárik, “A Convergence Theory for SVGD in the Population Limit under Talagrand’s Inequality T1”, ICML 2022, Baltimore, USA.
-
Adil Salim, Laurent Condat, Dmitry Kovalev and Peter Richtárik, “An Optimal Algorithm for Strongly Convex Minimization under Affine Constraints”, AISTATS 2022.
-
Dmitry Kovalev, Adil Salim and Peter Richtárik, “Optimal and Practical Algorithms for Smooth and Strongly Convex Decentralized Optimization”, NeurIPS 2020.
-
Anna Korba, Adil Salim, Michael Arbel, Giulia Luise and Arthur Gretton, “A Non-Asymptotic Analysis for Stein Variational Gradient Descent”, NeurIPS 2020.
-
Adil Salim and Peter Richtárik, “Primal Dual Interpretation of the Proximal Stochastic Gradient Langevin Algorithm”, NeurIPS 2020.
-
Adil Salim, Anna Korba and Giulia Luise, “The Wasserstein Proximal Gradient Algorithm”, NeurIPS 2020.
-
Sélim Chraibi, Ahmed Khaled, Dmitry Kovalev, Adil Salim, Peter Richtárik and Martin Takáč, “Distributed Fixed Point Methods with Compressed Iterates”, December 2019.
-
Sélim Chraibi, Adil Salim, Samuel Horváth, Filip Hanzely and Peter Richtárik, “Learning To Optimize Via Dual Space Preconditioning”, Technical report, September 2019.
-
Adil Salim, Dmitry Kovalev and Peter Richtárik, “Stochastic Proximal Langevin Algorithm: Potential Splitting and Nonasymptotic Rates”, Spotlight @ NeurIPS 2019, Vancouver, Canada. (The slides from the Spotlight, the Code, and the Poster are available.)
-
Michael Arbel, Anna Korba, Adil Salim and Arthur Gretton, “Maximum Mean Discrepancy Gradient Flow”, NeurIPS 2019, Vancouver, Canada.
-
Anna Korba, Adil Salim, Michael Arbel and Arthur Gretton, “Yet another look at Stein Variational Gradient Descent”, ICML 2019 Workshop on Stein’s Method, Long Beach, USA.
-
Adil Salim and Walid Hachem, “On the Performance of the Stochastic FISTA”, March 2019.
-
Sholom Schechtman, Adil Salim, and Pascal Bianchi, “Passty Langevin”, CAp 2019, Toulouse, France.
-
Adil Salim, Pascal Bianchi, and Walid Hachem, “A Constant Step Stochastic Douglas-Rachford Algorithm with Application to Non Separable Regularization”, IEEE ICASSP 2018, Calgary, Canada. (The Technical Report, an extended abstract of it, and the Code are available.)
-
Adil Salim, Pascal Bianchi, and Walid Hachem, “Snake: a Stochastic Proximal Gradient Algorithm for Regularized Problems over Large Graphs”, CAp 2017, Grenoble, France.
-
Pascal Bianchi, Walid Hachem, and Adil Salim, “Convergence d’un algorithme du gradient proximal stochastique à pas constant et généralisation aux opérateurs monotones aléatoires” [Convergence of a constant-step stochastic proximal gradient algorithm and generalization to random monotone operators], GRETSI 2017, Juan-les-Pins, France.
-
Adil Salim, Pascal Bianchi, and Walid Hachem, “A Stochastic Proximal Point Algorithm for Total Variation Regularization over Large Scale Graphs”, IEEE CDC 2016, Las Vegas, USA.
-
Rahul Mourya, Pascal Bianchi, Adil Salim and Cédric Richard, “An Adaptive Distributed Asynchronous Algorithm with Application to Target Localization”, IEEE CAMSAP 2017, Curaçao.
-
Pascal Bianchi, Walid Hachem and Adil Salim, “Building Stochastic Optimization Algorithms with Random Monotone Operators”, EUCCO 2016, Leuven, Belgium.
Reports
-
PhD thesis, Random monotone operators and application to stochastic optimization, under the supervision of Pascal Bianchi and Walid Hachem
-
Master’s thesis, Processus à accroissements libres [Processes with free increments], under the supervision of Philippe Biane
-
Internship report, Méthodes séquentielles de traitement de données [Sequential methods for data processing], under the supervision of Florian Maire