AdjointDEIS: Efficient Gradients for Diffusion Models
Zander W. Blasingame and Chen Liu
Abstract: The optimization of the latents and parameters of diffusion models with respect to some differentiable metric defined on the output of the model is a challenging and complex problem. Sampling for diffusion models is performed by solving either the probability flow ODE or the diffusion SDE, wherein a neural network approximates the score function, allowing a numerical ODE/SDE solver to be used. However, naive backpropagation techniques are memory intensive, requiring the storage of all intermediate states, and face additional complexity in handling the injected noise from the diffusion term of the diffusion SDE. We propose a novel family of bespoke ODE solvers for the continuous adjoint equations of diffusion models, which we call AdjointDEIS. We exploit the unique construction of diffusion SDEs to further simplify the formulation of the continuous adjoint equations using exponential integrators. Moreover, we provide convergence order guarantees for our bespoke solvers. Significantly, we show that the continuous adjoint equations for diffusion SDEs actually simplify to a simple ODE. Lastly, we demonstrate the effectiveness of AdjointDEIS for guided generation with an adversarial attack in the form of the face morphing problem. Our code will be released at https://github.com/zblasingame/AdjointDEIS.
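For context, here is a minimal sketch of the setting the abstract describes; it is not the paper's exact AdjointDEIS formulation (which uses exponential integrators), and the symbols f(t), g(t), sigma_t, and the noise-prediction network epsilon_theta follow common diffusion-model notation and are assumptions here. The first equation is the probability flow ODE used for sampling; the second is the standard continuous adjoint equation for the gradient of a loss \mathcal{L} on the output.

\[
  \frac{\mathrm{d}\boldsymbol{x}_t}{\mathrm{d}t}
    = f(t)\,\boldsymbol{x}_t
    + \frac{g^2(t)}{2\sigma_t}\,\boldsymbol{\epsilon}_\theta(\boldsymbol{x}_t, t)
\]
\[
  \boldsymbol{a}_t := \frac{\partial \mathcal{L}}{\partial \boldsymbol{x}_t},
  \qquad
  \frac{\mathrm{d}\boldsymbol{a}_t}{\mathrm{d}t}
    = -\boldsymbol{a}_t^{\top}\,
      \frac{\partial}{\partial \boldsymbol{x}_t}
      \left[ f(t)\,\boldsymbol{x}_t
      + \frac{g^2(t)}{2\sigma_t}\,\boldsymbol{\epsilon}_\theta(\boldsymbol{x}_t, t) \right]
\]

In this standard adjoint picture, the first ODE is integrated in the sampling direction to produce the output, while the adjoint ODE is integrated in the opposite direction, from the loss back toward the initial latent, which avoids storing all intermediate states of the solver.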
Submission history
From: Zander Blasingame
[v1] Thu, 23 May 2024 19:51:33 UTC (1,570 KB)
[v2] Fri, 1 Nov 2024 19:27:35 UTC (2,474 KB)
[v3] Tue, 21 Jan 2025 19:32:07 UTC (2,478 KB)