Constrained Sampling with Primal-Dual Langevin Monte Carlo
Luiz F. O. Chamon, Mohammad Reza Karimi, Anna Korba
Abstract: This work considers the problem of sampling from a probability distribution known up to a normalization constant while satisfying a set of statistical constraints specified by the expected values of general nonlinear functions. This problem finds applications in, e.g., Bayesian inference, where it can constrain moments to evaluate counterfactual scenarios or enforce desiderata such as prediction fairness. Methods developed to handle support constraints, such as those based on mirror maps, barriers, and penalties, are not suited for this task. This work therefore relies on gradient descent-ascent dynamics in Wasserstein space to put forward a discrete-time primal-dual Langevin Monte Carlo algorithm (PD-LMC) that simultaneously constrains the target distribution and samples from it. We analyze the convergence of PD-LMC under standard assumptions on the target distribution and constraints, namely (strong) convexity and log-Sobolev inequalities. To do so, we bring classical optimization arguments for saddle-point algorithms to the geometry of Wasserstein space. We illustrate the relevance and effectiveness of PD-LMC in several applications.
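For intuition, below is a minimal sketch (in Python/NumPy) of a primal-dual Langevin iteration of the kind the abstract describes; it is not the paper's exact PD-LMC algorithm. The toy target, the constraint function g, the step sizes eta and eta_dual, and all function names are illustrative assumptions. The primal step is a Langevin update on the Lagrangian potential f(x) + lam^T g(x); the dual step is projected gradient ascent on the multiplier, using the constraint evaluated at the current sample as a stochastic estimate of its expectation.

import numpy as np

rng = np.random.default_rng(0)

# Toy target: standard Gaussian, pi(x) proportional to exp(-f(x)) with f(x) = ||x||^2 / 2.
def grad_f(x):
    return x

# Illustrative constraint E[g(X)] <= 0 with g(x) = c - x[0],
# i.e. we ask the sampled distribution to satisfy E[X_0] >= c.
c = 1.0

def g(x):
    return np.array([c - x[0]])

def grad_g(x):
    grad = np.zeros((1, x.size))
    grad[0, 0] = -1.0
    return grad

def pd_lmc(n_iters=20000, eta=1e-2, eta_dual=1e-2, dim=2):
    x = np.zeros(dim)
    lam = np.zeros(1)          # dual variable, kept nonnegative
    samples = []
    for _ in range(n_iters):
        # Primal step: Langevin update on the Lagrangian potential f(x) + lam^T g(x).
        drift = grad_f(x) + grad_g(x).T @ lam
        x = x - eta * drift + np.sqrt(2 * eta) * rng.standard_normal(dim)
        # Dual step: projected gradient ascent on the multiplier,
        # using g evaluated at the current sample.
        lam = np.maximum(lam + eta_dual * g(x), 0.0)
        samples.append(x.copy())
    return np.array(samples), lam

samples, lam = pd_lmc()
print("empirical E[X_0]:", samples[len(samples) // 2:, 0].mean())  # should approach c
print("final dual variable:", lam)

In this toy problem the constraint is active, so the empirical mean of the first coordinate should settle near c while the remaining coordinates stay standard Gaussian.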
Submission history
From: Luiz F. O. Chamon
[v1] Fri, 1 Nov 2024 13:26:13 UTC (3,645 KB)
[v2] Tue, 7 Jan 2025 17:36:14 UTC (3,653 KB)