I work at an early-stage startup. Previously, I was a senior researcher at Microsoft Research, and before that a Google Research Fellow at the Simons Institute (UC Berkeley).
I work on reasoning with language models (especially the Phi series of language models) and the theory of diffusion models. Before that, my research focused mainly on sampling, optimization, and proximal methods. Here is a paper at the intersection of these topics.
News
- I will talk about diffusion models at BIRS (Canada) and IMS (Singapore). Thanks for the invitations!
- I am in Senegal for MLSS 2025. The energy is awesome!!!
- I am a workshop chair at NeurIPS 2024.
- Anna Korba and I presented a tutorial on Sampling as Optimization at ICML 2022. Here are our slides; you can watch the video here.
Some papers
LLMs
- Adil Salim, “Accelerating mathematical research with language models: A case study of an interaction with GPT-5-Pro on a convex analysis problem”, October 2025.
- Marah Abdin et al., “Phi-4 Technical Report”, December 2024.
- Marah Abdin et al., “Phi-3 Technical Report”, April 2024.
- Suriya Gunasekar, Yi Zhang et al., “Textbooks Are All You Need”, June 2023.
Diffusion models
- Khashayar Gatmiry, Sitan Chen and Adil Salim, “High-accuracy and dimension-free sampling with diffusions”, January 2026.
- Sitan Chen, Sinho Chewi, Holden Lee, Yuanzhi Li, Jianfeng Lu and Adil Salim, “The probability flow ODE is provably fast”, NeurIPS 2023.
- Sitan Chen, Sinho Chewi, Jerry Li, Yuanzhi Li, Adil Salim and Anru R. Zhang, “Sampling is as easy as learning the score: theory for diffusion models with minimal data assumptions”, Notable top 5% paper @ ICLR 2023, Kigali, Rwanda.