

Patch Diffusion: Faster and More Data-Efficient Training of Diffusion Models

Neural Information Processing Systems

Diffusion models are powerful, but they require a lot of time and data to train. We propose Patch Diffusion, a generic patch-wise training framework, to significantly reduce training time costs while improving data efficiency, thereby helping democratize diffusion model training to broader users. At the core of our innovations is a new conditional score function at the patch level, where the patch location in the original image is included as additional coordinate channels, while the patch size is randomized and diversified throughout training to encode the cross-region dependency at multiple scales. Sampling with our method is as easy as in the original diffusion model.
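The patch-level conditioning described above can be sketched as follows: crop a patch at a randomly chosen scale and append its per-pixel location in the full image as two extra coordinate channels. This is a minimal illustration under assumed conventions (channel-first arrays, coordinates normalized to [-1, 1]); the function name, patch scales, and layout are hypothetical, not the authors' implementation.

```python
import numpy as np

def random_patch_with_coords(image, patch_sizes=(16, 32, 64), rng=None):
    """Crop a random patch from `image` (shape (C, H, W)) and append its
    normalized (x, y) location in the original image as two coordinate
    channels, giving an array of shape (C + 2, s, s)."""
    rng = rng or np.random.default_rng()
    c, h, w = image.shape
    s = int(rng.choice(patch_sizes))            # randomized patch scale
    top = int(rng.integers(0, h - s + 1))       # random crop position
    left = int(rng.integers(0, w - s + 1))
    patch = image[:, top:top + s, left:left + s]

    # Per-pixel coordinates of this patch within the full image, in [-1, 1],
    # so the score network knows where the patch came from.
    ys = np.linspace(-1, 1, h)[top:top + s]
    xs = np.linspace(-1, 1, w)[left:left + s]
    yy, xx = np.meshgrid(ys, xs, indexing="ij")
    coords = np.stack([xx, yy])                 # (2, s, s)

    return np.concatenate([patch, coords], axis=0)
```

During training, drawing the patch size from several scales per example is what lets a single model capture cross-region dependencies at multiple resolutions.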

Unsupervised Representation Learning from Pre-trained Diffusion Probabilistic Models

Neural Information Processing Systems

Drawing inspiration from this method, which uses prior knowledge (class labels) to fill the gap, we aim to inversely extract knowledge from the gap, i.e., learn representations that can help to fill it.