I am a tenured associate professor in mathematics at the KTH Royal Institute of Technology in Stockholm. See here for a short bio and my journey so far. I also write a non-research blog here, though its update frequency depends on how busy I am at the moment.
Overall, I am interested in computational optimization and machine learning algorithms, motivated by principled applied mathematics, e.g., PDEs, gradient flows, optimal transport, and kernel methods.
Earlier in my career, I was interested in the robustness of optimization, control, and machine learning algorithms. This requires computational optimization tools that can manipulate probability distributions, which are inherently infinite-dimensional. That line of work led me to my current interests in mathematical foundations for machine learning and optimization over probability distributions, rooted in PDEs, gradient flows, and optimal transport.
For example, in some of my previous works, I developed robust ML algorithms that protect against distribution shifts using principled kernel methods. Those optimization algorithms have deep theoretical roots, such as in the analysis of PDEs. Building on that, my current research focuses on computational algorithms for machine learning and optimization via PDE gradient flows and optimal transport. Recently, I have become interested in the Hellinger geometry (a.k.a. Fisher-Rao), e.g., kernel methods and (Wasserstein-)Fisher-Rao, a.k.a. (spherical) Hellinger-Kantorovich, gradient flows.
To get in touch, click the icon at the bottom of the page. My replies to emails are sometimes delayed; please be patient.
Upcoming events
- July 20 - 31, 2026: I will give a lecture series on “Computational Gradient Flows and Optimal Transport” at the School of Mathematical Sciences, Peking University. Details are available at this link.
- May 24 - 29, 2026: SwissMAP Workshop on “Computational Optimization Meets Gradient Flows and Optimal Transport”. Organized by Niao He (ETH Zurich), Yifan Hu (EPFL), Daniel Kuhn (EPFL), and Jia-Jie Zhu (WIAS Berlin).
Recent talks (selected)
- SIAM UQ 2026 Minisymposium, Gradient Flows for Uncertainty Quantification: New Algorithms and Applications
- EPFL Bernoulli workshop “Particles, Flows & Maps for Sampling Complex Distributions”, 2025. Video recording available here
- Gradient Flows Face-to-Face Workshop in Granada, Spain, 2025. Organizers: Maria Bruna, José Alfredo Cañizo, José Antonio Carrillo, Antonio Esposito. Slides available here
- Workshop on Bayesian Analysis and Artificial Intelligence, Peking University, Beijing, China, 2025
- Banff International Research Station (BIRS) “Mathematical Analysis of Adversarial Machine Learning” workshop in Oaxaca, Mexico, from August 17 to August 22, 2025
- The Kantorovich Initiative Seminar, University of British Columbia: Kernel Approximation of Wasserstein and Fisher-Rao Gradient flows
- Invited talks at EPFL, Nov 2024 (slides available): “Kernel Approximation of Wasserstein and Fisher-Rao Gradient Flows” and “From distributional ambiguity to gradient flows: Wasserstein, Fisher-Rao, and kernel approximation”
- Workshop on Optimal Transport and PDEs, Program on The Mathematics of Data, Institute for Mathematical Sciences, National University of Singapore, 2024. Organizers: Philippe Rigollet, Afonso Bandeira, Subhro Ghosh
Open positions
- KTH Master thesis: if you are a master’s student already enrolled at KTH and interested in optimization for machine learning, deep generative models, optimal transport, or applications of PDEs/SDEs, please feel free to reach out.
- [PhD position] If you are interested in joining my group at KTH, please feel free to inquire with a CV and all transcripts. I read all inquiries but can only reply to those who are a good fit for our group.
- Joint PhD position at TU Darmstadt/KTH Royal Institute of Technology (with Jan Peters). See the ad here.
- Future positions will be posted. If you anticipate graduation within the next year or two, please feel free to inquire.
News and updates