Bootstrap

[Paper notes] [Distillation / diffusion] BOOT: Data-free Distillation of Denoising Diffusion Models with Bootstrapping

Apple; University of Pennsylvania

Abstract

Diffusion models have demonstrated excellent potential for generating diverse images.
However, their performance often suffers from slow generation due to iterative denoising. Knowledge distillation has recently been proposed as a remedy that can reduce the number of inference steps to one or a few, without significant quality degradation.
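To make the distillation idea concrete, here is a minimal toy sketch (my own illustration, not the paper's algorithm): a "teacher" that denoises over many iterative steps is distilled into a "student" that maps noise to the teacher's final output in a single step. The linear teacher/student and the MSE objective are stand-in assumptions for the trained networks used in practice.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "teacher": iteratively denoises toward a fixed target, standing in
# for a multi-step diffusion sampler (the real teacher is a trained network).
TARGET = np.array([1.0, -2.0, 0.5])

def teacher_sample(z, steps=50):
    x = z.copy()
    for _ in range(steps):
        x = x + 0.1 * (TARGET - x)  # one small denoising step
    return x

# Toy "student": a single affine map z -> W z + b, trained so that ONE
# student step matches the teacher's 50-step output (step distillation).
W = np.zeros((3, 3))
b = np.zeros(3)
lr = 0.1

for _ in range(500):
    z = rng.standard_normal(3)
    y_teacher = teacher_sample(z)   # expensive multi-step target
    y_student = W @ z + b           # cheap one-step prediction
    err = y_student - y_teacher     # gradient of 0.5 * MSE
    W -= lr * np.outer(err, z)
    b -= lr * err

# After training, one student evaluation approximates 50 teacher steps.
z = rng.standard_normal(3)
print(np.abs((W @ z + b) - teacher_sample(z)).max())
```

Note that this toy loop still queries the teacher online for every training sample; the point of BOOT, as the abstract goes on to say, is to avoid both that online cost and offline synthetic-data generation.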
However, existing distillation methods either require significant amounts of offline computation for generating synthetic training data from the teacher model, or need to perform expensive online learning with the help of real data.