Denoising diffusion probabilistic models (DDPMs) represent a relatively new class of generative methods whose design space has yet to be fully explored. They use Langevin dynamics, expressed as stochastic differential equations, to describe a forward process that transforms data into noise and a reverse process that transforms noise back into generated data. Many of these methods augment the data, treated as a ``position'' variable, with auxiliary variables referred to as ``velocity'', ``acceleration'', and so on. In this sense, it is possible to ``critically damp'' the dynamics. Critical damping has been successfully introduced in Critically-Damped Langevin Dynamics (CLD) and Critically-Damped Third-Order Langevin Dynamics (TOLD++), but has not yet been applied to dynamics of arbitrary order. The proposed methodology generalizes Higher-Order Langevin Dynamics (HOLD), a recent state-of-the-art diffusion method, by introducing the concept of critical damping from systems analysis. Similarly to TOLD++, this work proposes an optimal set of hyperparameters in the $n$-dimensional case, where HOLD leaves these to be user-defined. Additionally, this work provides closed-form solutions for the mean and covariance of the forward process that greatly simplify its implementation. Experiments are performed on the CIFAR-10 and CelebA-HQ $256 \times 256$ datasets and evaluated with the FID metric.
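To make the position/velocity formulation and the critical-damping condition concrete, the following is a minimal sketch of the second-order (CLD-style) forward SDE simulated with Euler--Maruyama; it is not the paper's arbitrary-order formulation, and the hyperparameter names (\texttt{beta}, \texttt{M}, \texttt{Gamma}) and values are illustrative assumptions rather than the proposed optimal settings.

\begin{verbatim}
import numpy as np

# Illustrative hyperparameters (assumptions, not the paper's values):
beta = 4.0                  # time-rescaling constant
M = 0.25                    # "mass" of the auxiliary velocity variable
Gamma = 2.0 * np.sqrt(M)    # critical damping condition: Gamma^2 = 4M

def forward_cld(x0, n_steps=1000, T=1.0, rng=None):
    """Euler--Maruyama simulation of a critically damped second-order
    Langevin forward SDE: the data x is the 'position', v is the
    auxiliary 'velocity', and noise is injected only into v."""
    rng = np.random.default_rng() if rng is None else rng
    dt = T / n_steps
    x = np.array(x0, dtype=float)
    v = np.zeros_like(x)    # velocity commonly initialized at zero
    for _ in range(n_steps):
        dW = rng.normal(scale=np.sqrt(dt), size=x.shape)
        x_new = x + beta * (v / M) * dt
        v_new = v + beta * (-x - (Gamma / M) * v) * dt \
                  + np.sqrt(2.0 * Gamma * beta) * dW
        x, v = x_new, v_new
    return x, v
\end{verbatim}

Setting \texttt{Gamma} away from $2\sqrt{M}$ yields under- or over-damped trajectories; the paper's contribution generalizes this damping choice, along with the closed-form mean and covariance of the forward process, to Langevin dynamics of arbitrary order.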