Wuchen Li

Assistant Professor of Mathematics at the University of South Carolina.

Google Scholar

Colloquia and RTG data science seminars.

Optimal transport and Mean field games workshop series



Recent News

Draft "Transport f divergences" is online. We define a general class of divergences to measure differences between probability density functions on a one-dimensional sample space. The construction applies a convex function to the Jacobian of the mapping function that pushes forward one density to the other. We call these information measures transport f-divergences. We present several properties of transport f-divergences, including invariances, convexities, variational formulations, and Taylor expansions in terms of mapping functions. Examples of transport f-divergences in generative models are provided. April 23, 2025.
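In one dimension the construction admits a direct numerical illustration. The sketch below (not the paper's code) estimates a transport f-divergence on a grid, assuming the definition D_f(p, q) = ∫ f(T'(x)) p(x) dx, where T = F_q^{-1} ∘ F_p is the monotone map pushing p forward to q and f is convex with f(1) = 0; the function name `transport_f_divergence` and the grid quadrature are illustrative choices:

```python
import numpy as np
from scipy import stats

def transport_f_divergence(p, q, f, grid):
    """Estimate D_f(p, q) = ∫ f(T'(x)) p(x) dx on a uniform grid.

    T = F_q^{-1} ∘ F_p is the monotone map with T_# p = q, so in 1D
    its Jacobian is T'(x) = p(x) / q(T(x)).
    """
    T = q.ppf(p.cdf(grid))             # monotone transport map
    jacobian = p.pdf(grid) / q.pdf(T)  # T'(x)
    dx = grid[1] - grid[0]
    return np.sum(f(jacobian) * p.pdf(grid)) * dx

# f(s) = s - 1 - log(s) is convex with f(1) = 0.
f = lambda s: s - 1.0 - np.log(s)
grid = np.linspace(-8.0, 8.0, 4001)
p, q = stats.norm(0.0, 1.0), stats.norm(0.0, 2.0)

d_pp = transport_f_divergence(p, p, f, grid)  # identical densities: T' ≡ 1
d_pq = transport_f_divergence(p, q, f, grid)  # scaling map T(x) = 2x: T' ≡ 2
```

For identical densities the map is the identity, so the divergence vanishes; for the two Gaussians above the map is T(x) = 2x with constant Jacobian, giving the value f(2) = 1 - log 2.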

Draft "Transport alpha divergences" is online. We derive a particular class of divergences measuring the difference between probability density functions on a one-dimensional sample space. This divergence is a one-parameter variation of the Ito-Sauda divergence between quantile density functions. We prove that the proposed divergence is a one-parameter variation between the transport Kullback-Leibler divergence and the Hessian distance of the negative Boltzmann entropy with respect to the Wasserstein-2 metric. From Taylor expansions, we also formulate the 3-symmetric tensor in Wasserstein space, which is given by iterative Gamma three operators. The alpha-geodesic on Wasserstein space is also derived. From these properties, we name the proposed information measures transport alpha divergences. We provide several examples of transport alpha divergences for generative models in machine learning applications. April 18, 2025.

Draft "Accelerated Stein Variational Gradient Flow" is online. Stein variational gradient descent (SVGD) is a kernel-based particle method for sampling from a target distribution, e.g., in generative modeling and Bayesian inference. SVGD does not require estimating the gradient of the log-density, which is called score estimation. In practice, SVGD can be slow compared to score-estimation-based sampling algorithms. To design fast and efficient high-dimensional sampling algorithms, we introduce ASVGD, an accelerated SVGD, based on an accelerated gradient flow in a metric space of probability densities following Nesterov's method. We then derive a momentum-based discrete-time sampling algorithm, which evolves a set of particles deterministically. To stabilize the particles' momentum update, we also study a Wasserstein metric regularization. For the generalized bilinear kernel and the Gaussian kernel, toy numerical examples with varied target distributions demonstrate the effectiveness of ASVGD compared to SVGD and other popular sampling methods. April 1, 2025.
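For context, the baseline SVGD update that ASVGD accelerates takes the form x_i ← x_i + (ε/n) Σ_j [k(x_j, x_i) ∇log π(x_j) + ∇_{x_j} k(x_j, x_i)]. Below is a minimal NumPy sketch of this plain, non-accelerated update with a Gaussian kernel; the particle count, bandwidth, and step size are illustrative choices, not values from the paper:

```python
import numpy as np

def svgd_step(X, grad_log_pi, h=1.0, eps=0.1):
    """One plain SVGD update with Gaussian kernel k(x, y) = exp(-|x-y|^2 / (2h))."""
    n = X.shape[0]
    diff = X[:, None, :] - X[None, :, :]                # diff[i, j] = x_i - x_j
    K = np.exp(-np.sum(diff**2, axis=-1) / (2.0 * h))   # kernel Gram matrix
    attraction = K @ grad_log_pi(X)                     # pulls particles to high density
    repulsion = np.sum(K[:, :, None] * diff, axis=1) / h  # keeps particles spread out
    return X + eps * (attraction + repulsion) / n

# Target: standard 1D Gaussian, so grad log pi(x) = -x (no score estimation needed).
rng = np.random.default_rng(0)
X = rng.normal(3.0, 0.5, size=(100, 1))  # particles start far from the target
for _ in range(500):
    X = svgd_step(X, lambda x: -x)
```

After the loop the particle mean and spread approximate those of the target; ASVGD replaces this first-order update with a Nesterov-type momentum scheme for the particles.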

Draft "Variational conditional normalizing flows for computing second-order mean field control problems" is online. Mean field control (MFC) problems have vast applications in artificial intelligence, engineering, and economics, while solving MFC problems accurately and efficiently in high-dimensional spaces remains challenging. This work introduces variational conditional normalizing flow (VCNF), a neural network-based variational algorithm for solving general MFC problems based on flow maps. Formulating MFC problems as optimal control of Fokker-Planck (FP) equations with suitable constraints and cost functionals, we use VCNF to model the Lagrangian formulation of the MFC problems. In particular, VCNF builds upon conditional normalizing flows and neural spline flows, allowing efficient calculations of the inverse push-forward maps and score functions in MFC problems. We demonstrate the effectiveness of VCNF through extensive numerical examples, including optimal transport, regularized Wasserstein proximal operators, and flow matching problems for FP equations. March 25, 2025.
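The mechanism underlying all normalizing flows, computing exact log-densities of a pushforward via the change-of-variables formula, can be illustrated with a single affine map. This toy sketch is not the paper's conditional spline construction:

```python
import numpy as np

def affine_flow_logpdf(x, mu, log_sigma):
    """log-density of X = T(Z) = mu + exp(log_sigma) * Z with Z ~ N(0, 1).

    Change of variables: log p_X(x) = log p_Z(T^{-1}(x)) - log|dT/dz|.
    """
    z = (x - mu) * np.exp(-log_sigma)                # inverse pushforward map
    log_base = -0.5 * (z**2 + np.log(2.0 * np.pi))   # standard normal base density
    return log_base - log_sigma                      # subtract the log-Jacobian

# Matches the closed-form N(0.5, 2^2) log-density; the score function is
# the gradient of this expression in x.
val = affine_flow_logpdf(1.0, 0.5, np.log(2.0))
```

Spline flows replace the affine map with a monotone piecewise-rational map whose inverse and Jacobian remain cheap to evaluate, which is what makes the inverse push-forward maps and score functions tractable.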

Draft "Splitting Regularized Wasserstein Proximal Algorithms for Nonsmooth Sampling Problems" is online. Sampling from nonsmooth target probability distributions is essential in various applications, including the Bayesian Lasso. We propose a splitting-based sampling algorithm for the time-implicit discretization of the probability flow for the Fokker-Planck equation, where the score function, defined as the gradient of the logarithm of the current probability density function, is approximated by the regularized Wasserstein proximal. When the prior distribution is the Laplace prior, our algorithm is explicitly formulated as a deterministic interacting particle system, incorporating softmax operators and shrinkage operations to efficiently compute the gradient drift vector field and the score function. The proposed formulation introduces a particular class of attention layers in transformer structures, which can sample sparse target distributions. We verify convergence towards target distributions in terms of Renyi divergences under suitable conditions. Numerical experiments in high-dimensional nonsmooth sampling problems, such as sampling from mixed Gaussian and Laplace distributions, logistic regressions, image restoration with L1-TV regularization, and Bayesian neural networks, demonstrate the efficiency and robust performance of the proposed method. Feb 23, 2025.
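The shrinkage operation mentioned above is the standard soft-thresholding map, the proximal operator of the scaled L1 norm arising from the Laplace prior; a minimal sketch (the softmax-weighted score approximation of the full algorithm is not reproduced here):

```python
import numpy as np

def shrink(x, lam):
    """Soft-thresholding: argmin_y 0.5*||y - x||^2 + lam*||y||_1, entrywise."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

v = shrink(np.array([3.0, -0.4, 0.0, -2.5]), 1.0)
```

Entries within lam of zero are mapped exactly to zero, which is how the particle system produces sparse samples.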

Draft "Geometric calculations on density manifolds from reciprocal relations in hydrodynamics" is online. Hydrodynamics are systems of equations describing the evolution of macroscopic states in non-equilibrium thermodynamics. From generalized Onsager reciprocal relations, one can formulate a class of hydrodynamics as gradient flows of free energies. In recent years, Onsager gradient flows have been widely investigated in optimal transport-type metric spaces with nonlinear mobilities, namely hydrodynamical density manifolds. This paper studies geometric calculations in these hydrodynamical density manifolds. We first formulate Levi-Civita connections, gradients, Hessians, and parallel transport, and then derive Riemannian and sectional curvatures on density manifolds. We finally present closed formulas for sectional curvatures of density manifolds in one-dimensional spaces, in which the sign of the curvature is characterized by the convexity of the mobility. In examples, we present density manifolds and their sectional curvatures in zero range models, such as independent particles, simple exclusion processes, and Kipnis-Marchioro-Presutti models. Jan 27, 2025.

Previous results

Research Interests

Transport information geometry in complex dynamical systems, PDEs, statistics, optimization, control and games, mathematical data science, graphs and neural networks, and scientific computing.


Recent Publications