July 19, 2024


Source: https://arxiv.org/pdf/2210.03142.pdf

High-resolution image synthesis with denoising diffusion probabilistic models (DDPMs) using classifier-free guidance, as in DALL·E 2, GLIDE, and Imagen, has achieved state-of-the-art results. The drawback of such models is that their inference procedure requires hundreds of evaluations of both a class-conditional model and an unconditional model, making them too computationally expensive for many practical applications.
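The cost comes from the guidance step itself: at every denoising iteration, the conditional and unconditional noise predictions are combined into a single guided prediction. A minimal sketch of that combination (function name and the toy arrays are illustrative, not from the paper):

```python
import numpy as np

def classifier_free_guidance(eps_cond, eps_uncond, w):
    """Combine conditional and unconditional noise predictions.

    w = 0 recovers the purely conditional model; larger w trades
    sample diversity for fidelity:
        eps_guided = (1 + w) * eps_cond - w * eps_uncond
    Note that BOTH model evaluations are needed at every step.
    """
    return (1.0 + w) * eps_cond - w * eps_uncond

# Toy example with random stand-ins for the two noise predictions:
rng = np.random.default_rng(0)
eps_c = rng.standard_normal((4, 4))
eps_u = rng.standard_normal((4, 4))
guided = classifier_free_guidance(eps_c, eps_u, w=2.0)
```

Repeating this for hundreds of steps, with two network evaluations per step, is what makes naive sampling so slow.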

In their recent study, On Distillation of Guided Diffusion Models, researchers propose a novel method for distilling classifier-free guided diffusion models into models with high sampling efficiency. The resulting models perform as well as the original model but require up to 256 times fewer sampling steps.

The researchers' distillation method consists of two stages. Given a trained, guided teacher model, a single student model is first trained to match the combined output of the teacher's two diffusion models (conditional and unconditional). This learned student model is then progressively distilled into a model requiring fewer and fewer sampling steps. The resulting single distilled model can efficiently trade off sample quality and diversity across a range of guidance strengths.
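The two stages above can be sketched as follows. This is a hedged, simplified illustration, not the paper's implementation: `guided_teacher`, `stage_one_loss`, and `stage_two_schedule` are hypothetical names, and the student's network is abstracted away as a plain prediction array.

```python
import numpy as np

def guided_teacher(eps_cond, eps_uncond, w):
    # Classifier-free guided prediction of the frozen teacher.
    return (1.0 + w) * eps_cond - w * eps_uncond

def stage_one_loss(student_pred, eps_cond, eps_uncond, w):
    # Stage 1: a single student, conditioned on the guidance strength w
    # as an extra input, regresses the teacher's guided output (MSE),
    # with w sampled over the desired guidance range during training.
    target = guided_teacher(eps_cond, eps_uncond, w)
    return float(np.mean((student_pred - target) ** 2))

def stage_two_schedule(n_steps, n_rounds):
    # Stage 2 (progressive distillation): each round trains a new
    # student that takes one step where the previous model took two,
    # halving the required number of sampling steps.
    schedule = [n_steps]
    for _ in range(n_rounds):
        n_steps //= 2
        schedule.append(n_steps)
    return schedule

# e.g. distilling a 1024-step sampler over 8 halving rounds:
print(stage_two_schedule(1024, 8))  # [1024, 512, 256, 128, 64, 32, 16, 8, 4]
```

The key design choice is that after stage 1, only one network evaluation per step is needed (instead of two), and stage 2 then shrinks the number of steps themselves.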


The proposed sampling technique combines a deterministic sampler with a brand-new stochastic sampling procedure: one deterministic sampling step covering twice the original step length, followed by one stochastic step backward (i.e., perturbed with noise) of the original step length.
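The alternation described above can be sketched roughly as follows. This is a hypothetical illustration under simplifying assumptions: the Euler-style `deterministic_step` is a stand-in for the distilled model's actual deterministic sampler, the `denoise` signature is assumed, and the noise-level bookkeeping is only meant to convey the "2x forward, 1x noisy backward" pattern.

```python
import numpy as np

def deterministic_step(x, eps, sigma_from, sigma_to):
    # Euler-style update; a stand-in for the distilled model's
    # deterministic (DDIM-like) sampler.
    return x + (sigma_to - sigma_from) * eps

def two_phase_sampler(denoise, x, sigmas, rng):
    """Sketch of the stochastic procedure: a deterministic step
    spanning twice the step length, then a stochastic step back
    up by one step length (perturbing with fresh noise)."""
    for i in range(len(sigmas) - 2):
        eps = denoise(x, sigmas[i])
        # Deterministic step over 2x the step length: sigma_i -> sigma_{i+2}.
        x = deterministic_step(x, eps, sigmas[i], sigmas[i + 2])
        # Stochastic step backward: re-noise from sigma_{i+2} to sigma_{i+1}.
        var = sigmas[i + 1] ** 2 - sigmas[i + 2] ** 2
        x = x + np.sqrt(max(var, 0.0)) * rng.standard_normal(x.shape)
    # Final deterministic step of the original step length.
    eps = denoise(x, sigmas[-2])
    return deterministic_step(x, eps, sigmas[-2], sigmas[-1])

# Toy run with a dummy noise predictor on a decreasing noise schedule:
rng = np.random.default_rng(0)
sigmas = np.linspace(1.0, 0.0, 9)
dummy_denoise = lambda x, s: np.zeros_like(x)
sample = two_phase_sampler(dummy_denoise, rng.standard_normal((8, 8)), sigmas, rng)
```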

In their empirical investigation, the team applied their strategy to classifier-free guided DDPMs and ran image-generation experiments on the ImageNet 64×64 and CIFAR-10 datasets. The findings demonstrate that, despite being up to 256 times faster to sample from, the proposed approach can produce “visually decent” samples in as few as one step, with FID/IS (Fréchet Inception Distance / Inception Score) scores comparable to those of the original baseline models.

The work shows how the proposed method can effectively address the high computational costs that have limited the practical use of denoising diffusion probabilistic models.

This article is a research summary written by Marktechpost Staff based on the research paper 'On Distillation of Guided Diffusion Models'. All credit for this research goes to the researchers on this project. Check out the paper and reference article.

Ashish Kumar is a consulting intern at Marktechpost. He is currently pursuing his BTech at the Indian Institute of Technology (IIT), Kanpur. He is passionate about exploring new advancements in technology and their real-life applications.
