Denoising MCMC for Accelerating Diffusion-Based Generative Models

Abstract
The sampling process of diffusion models can be interpreted as solving the reverse stochastic differential equation (SDE) or the ordinary differential equation (ODE) of the diffusion process, which often requires up to thousands of discretization steps to generate a single image. This has sparked great interest in developing efficient integration techniques for reverse-S/ODEs. Here, we propose an orthogonal approach to accelerating score-based sampling: Denoising MCMC (DMCMC). DMCMC first uses MCMC to produce initialization points for reverse-S/ODE integration in the product space of data and diffusion time. Then, a reverse-S/ODE integrator is used to denoise the initialization points. Since MCMC traverses close to the data manifold, the cost of producing a clean sample with DMCMC is much lower than that of producing a clean sample from noise. Denoising Langevin Gibbs, an instance of DMCMC, successfully accelerates all six reverse-S/ODE integrators considered in this work, and achieves state-of-the-art results: in the limited number of score function evaluation (NFE) setting on CIFAR10, we have 3.25 FID with ≈ 10 NFE and 2.49 FID with ≈ 16 NFE. On CelebA-HQ-256, we have 6.99 FID with ≈ 160 NFE, which beats the current best record of Kim et al. (2022a) among score-based models, 7.16 FID with 4000 NFE. Code: https://github.com/1202kbs/DMCMC.
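The two-stage idea in the abstract (an MCMC chain that stays near the data manifold, followed by a short reverse-S/ODE integration to clean up each point) can be illustrated with a minimal toy sketch. Everything below is hypothetical and not from the paper: a 1-D Gaussian data distribution under a variance-exploding diffusion with sigma(t) = t, so the score of the perturbed marginal is analytic and no learned score network is needed.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: data ~ N(0, 1), VE diffusion with sigma(t) = t.
# The perturbed marginal at time t is N(0, 1 + t^2), so the score is analytic.
def score(x, t):
    return -x / (1.0 + t * t)

def mcmc_step(x, t, step=0.01):
    # One Langevin move at a fixed, small noise level t. In DMCMC's terms,
    # each (x, t) pair is an initialization point in the product space of
    # data and diffusion time; here t is held fixed for simplicity.
    noise = rng.standard_normal(x.shape)
    return x + step * score(x, t) + np.sqrt(2.0 * step) * noise

def denoise(x, t, n_steps=20):
    # Euler integration of the probability-flow ODE from time t down to 0.
    # For the VE SDE with sigma(t) = t, the ODE is dx/dt = -t * score(x, t);
    # stepping backward in time flips the sign, giving x += dt * t * score.
    dt = t / n_steps
    for i in range(n_steps):
        ti = t - i * dt
        x = x + dt * ti * score(x, ti)
    return x
```

Because the chain starts at a small diffusion time rather than at pure noise, the denoising integration is short; this is the cost saving the abstract describes. In this Gaussian toy case the ODE maps samples of N(0, 1 + t^2) at time t to samples of N(0, 1) at time 0.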
Publisher
International Machine Learning Society (IMLS)
Issue Date
2023-07
Language
English
Citation
40th International Conference on Machine Learning, ICML 2023
URI
http://hdl.handle.net/10203/316305
Appears in Collection
AI-Conference Papers (Conference Papers)