This is a simple PyTorch implementation of Your Diffusion Model is Secretly a Zero-Shot Classifier.
model.py is a minimal implementation of a conditional diffusion model that supports Bayesian inference via Monte Carlo sampling. During training, it learns to generate MNIST digits conditioned on a class label. During inference, it samples pairs of timesteps and noise $(t, \epsilon)$ to estimate how well each candidate class explains the input image.
The conditioning roughly follows the method described in Classifier-Free Diffusion Guidance (also used in Imagen). The model infuses timestep embeddings and class-label embeddings into the network. At training time, the class label is randomly replaced with a null token with probability $p_{\text{uncond}}$, so a single model learns both conditional and unconditional denoising. At sampling time, the two noise predictions are mixed with a guidance weight $w$: $\tilde{\epsilon} = (1 + w)\,\epsilon_\theta(x_t, c) - w\,\epsilon_\theta(x_t)$. Increasing $w$ strengthens adherence to the class label at the cost of sample diversity.
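The label-dropping training step can be sketched as follows. This is a minimal stand-in, not the repo's actual model.py: the MLP denoiser, the linear $\bar\alpha_t$ schedule, and the `null_token` convention are illustrative assumptions.

```python
import torch
import torch.nn as nn

class ConditionalDenoiser(nn.Module):
    """Toy epsilon-predictor. A real implementation would use a U-Net;
    a small MLP stands in here so the sketch runs standalone."""
    def __init__(self, n_classes=10, dim=28 * 28, emb_dim=32, T=1000):
        super().__init__()
        # index n_classes is reserved as the "null" (unconditional) token
        self.class_emb = nn.Embedding(n_classes + 1, emb_dim)
        self.time_emb = nn.Embedding(T, emb_dim)
        self.net = nn.Sequential(
            nn.Linear(dim + 2 * emb_dim, 128), nn.ReLU(), nn.Linear(128, dim)
        )
        self.null_token = n_classes

    def forward(self, x_t, t, c):
        # infuse timestep and class embeddings alongside the noisy input
        h = torch.cat([x_t, self.time_emb(t), self.class_emb(c)], dim=-1)
        return self.net(h)

def training_step(model, x0, labels, p_uncond=0.1, T=1000):
    """One classifier-free-guidance training step: noise the batch and
    randomly replace labels with the null token with prob. p_uncond."""
    b = x0.shape[0]
    t = torch.randint(0, T, (b,))
    eps = torch.randn_like(x0)
    # a crude linear alpha-bar schedule stands in for the real one
    a_bar = (1.0 - t.float() / T).clamp(min=1e-3).unsqueeze(-1)
    x_t = a_bar.sqrt() * x0 + (1.0 - a_bar).sqrt() * eps
    drop = torch.rand(b) < p_uncond
    c = torch.where(drop, torch.full_like(labels, model.null_token), labels)
    return ((model(x_t, t, c) - eps) ** 2).mean()
```

Usage is the standard loop: `loss = training_step(model, batch, labels); loss.backward(); opt.step()`.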
The basic idea of a diffusion classifier is Bayesian inference, that is, $p(c \mid x) = \frac{p(x \mid c)\,p(c)}{\sum_{c'} p(x \mid c')\,p(c')}$. A uniform prior over the classes cancels out, leaving only the class-conditional likelihoods $p(x \mid c)$. Each likelihood is bounded by an ELBO that reduces to an expected noise-prediction error $\mathbb{E}_{t,\epsilon}\!\left[\lVert \epsilon - \epsilon_\theta(x_t, c) \rVert^2\right]$,
which can be estimated by Monte Carlo sampling (see Your Diffusion Model is Secretly a Zero-Shot Classifier).
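Putting the pieces together, with $N$ shared draws $(t_n, \epsilon_n)$ the posterior is approximated as (a restatement of the paper's estimator in this README's notation):

```math
p_\theta(c_i \mid x) \approx \frac{\exp\left\{-\frac{1}{N}\sum_{n=1}^{N} \lVert \epsilon_n - \epsilon_\theta(x_{t_n}, c_i) \rVert^2\right\}}{\sum_{j} \exp\left\{-\frac{1}{N}\sum_{n=1}^{N} \lVert \epsilon_n - \epsilon_\theta(x_{t_n}, c_j) \rVert^2\right\}}
```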
During Bayesian inference, we no longer drop the class label: every denoising pass is conditioned on one of the candidate classes, and the class with the lowest average noise-prediction error wins.
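The inference loop can be sketched as below. The `model(x_t, t, c)` interface and the linear $\bar\alpha_t$ schedule are assumptions for illustration; the key ideas are sharing the same $(t, \epsilon)$ draws across classes to reduce variance, and turning the per-class errors into a posterior with a softmax over negative errors.

```python
import torch

@torch.no_grad()
def classify(model, x0, n_classes=10, n_samples=50, T=1000):
    """Estimate p(c|x) by Monte Carlo: for each candidate class, average
    ||eps - eps_theta(x_t, c)||^2 over n_samples shared (t, eps) draws."""
    t = torch.randint(0, T, (n_samples,))
    eps = torch.randn(n_samples, *x0.shape)
    a_bar = (1.0 - t.float() / T).clamp(min=1e-3).view(-1, 1)
    x_t = a_bar.sqrt() * x0 + (1.0 - a_bar).sqrt() * eps
    errors = []
    for c in range(n_classes):
        cond = torch.full((n_samples,), c, dtype=torch.long)
        pred = model(x_t, t, cond)  # conditional pass, label never dropped
        errors.append(((pred - eps) ** 2).mean())
    # uniform prior over classes: posterior is softmax of negative errors
    return torch.softmax(-torch.stack(errors), dim=0)
```

The predicted label is then `classify(model, x0).argmax()`; raising `n_samples` tightens the estimate at a proportional cost in compute, which is the trade-off the table below measures.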
We trained the conditioned diffusion model for 50 epochs, and performed classification with a varying number of Monte Carlo samples per class:
| MC samples | Accuracy (%) | Time |
|---|---|---|
| 20 | 98.27 | ~14 min |
| 50 | 98.98 | ~35 min |
| 100 | 99.23 | ~70 min |