Soft Truncation: A Universal Training Technique of Score-based Diffusion Model for High Precision Score Estimation

Recent advances in diffusion models have brought state-of-the-art performance on image generation tasks. However, empirical results from previous research on diffusion models imply an inverse correlation between density estimation and sample generation performance. This paper provides empirical evidence that this inverse correlation arises because density estimation is dominated by the loss at small diffusion times, whereas sample generation mainly depends on large diffusion times. Training a score network well across the entire diffusion time range is demanding, however, because the loss scale is severely imbalanced across diffusion times. For successful training, we therefore introduce Soft Truncation, a universally applicable training technique for diffusion models that softens the fixed, static truncation hyperparameter into a random variable. In experiments, Soft Truncation achieves state-of-the-art performance on the CIFAR-10, CelebA, CelebA-HQ 256×256, and STL-10 datasets.
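The core mechanism described in the abstract is small enough to sketch in code. The snippet below is a minimal, illustrative PyTorch rendering of the idea, not the authors' reference implementation: instead of always integrating the denoising score matching loss over a fixed truncated time interval [eps, T], the truncation time eps is re-drawn for every mini-batch, so the hardest small-time losses enter only a fraction of the updates. The VE-SDE noise schedule, the constants, the log-uniform prior over eps, and all function names are assumptions made for the sake of a self-contained example.

```python
import math
import torch

T = 1.0                      # terminal diffusion time
EPS_MIN = 1e-5               # smallest admissible truncation time (assumed)
SIGMA_MIN, SIGMA_MAX = 0.01, 50.0

def sigma(t):
    # Marginal noise scale of an assumed VE forward SDE at time t.
    return SIGMA_MIN * (SIGMA_MAX / SIGMA_MIN) ** t

def soft_truncation_loss(score_net, x0):
    """One loss evaluation with Soft Truncation (illustrative sketch).

    score_net(x, t) is any network estimating the score grad_x log p_t(x);
    x0 is a batch of clean images shaped (B, C, H, W).
    """
    b = x0.shape[0]

    # Soft Truncation: the truncation time eps is a random variable drawn
    # afresh per mini-batch (log-uniform prior assumed here), rather than
    # a fixed, static hyperparameter.
    eps = math.exp(torch.empty(1).uniform_(math.log(EPS_MIN), math.log(T)).item())

    # Diffusion times are then sampled on the truncated interval [eps, T];
    # batches that draw a small eps expose the network to the hard,
    # high-precision small-time regime.
    t = torch.rand(b, device=x0.device) * (T - eps) + eps

    std = sigma(t).view(-1, 1, 1, 1)
    noise = torch.randn_like(x0)
    xt = x0 + std * noise                # forward perturbation x_t | x_0

    # Denoising score matching: the conditional score of the Gaussian
    # perturbation kernel is -noise / sigma(t); the sigma^2 weighting keeps
    # per-time loss scales comparable.
    score = score_net(xt, t)
    loss = (std ** 2 * (score + noise / std) ** 2).sum(dim=(1, 2, 3)).mean()
    return loss
```

One way to read this design: in expectation the objective averages the truncated losses over the prior on eps, so the score network is still trained on the small-time regime regularly, without every single update being dominated by its disproportionately large loss scale.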
Publisher
International Conference on Machine Learning
Issue Date
2022-07-18
Language
English
Citation
The 39th International Conference on Machine Learning, ICML 2022
URI
http://hdl.handle.net/10203/299794
Appears in Collection
MA-Conference Papers; IE-Conference Papers