Abstract:
Diffusion Probabilistic Models (DPMs) have received considerable attention in generative modeling. They have achieved state-of-the-art performance on many machine learning tasks such as image generation, audio waveform synthesis, text-to-image generation, and molecule design, to name a few.
A DPM consists of a forward stochastic procedure that gradually adds noise to data, and a reverse procedure that transforms random noise into realistic samples. Samples are generated by iteratively running the reverse procedure. In this talk, we briefly introduce the background of DPMs, including both the Variational Lower Bound formulation and the SDE interpretation. We also present a few successful applications of DPMs to modern generative modeling tasks.
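To make the two procedures above concrete, the following is a minimal sketch in standard DDPM notation: a forward step that corrupts clean data, and an iterative reverse (ancestral sampling) loop that maps noise back to a sample. The noise schedule, number of steps, and the placeholder predict_noise function are illustrative assumptions, not the method of any specific paper; in practice predict_noise would be a trained neural network.

import numpy as np

T = 1000
betas = np.linspace(1e-4, 0.02, T)      # illustrative forward noise schedule
alphas = 1.0 - betas
alpha_bars = np.cumprod(alphas)

def forward_noise(x0, t, rng):
    """Forward procedure: corrupt clean data x0 into its noisy version x_t."""
    eps = rng.standard_normal(x0.shape)
    x_t = np.sqrt(alpha_bars[t]) * x0 + np.sqrt(1.0 - alpha_bars[t]) * eps
    return x_t, eps

def predict_noise(x_t, t):
    """Placeholder for a learned noise-prediction model eps_theta(x_t, t)."""
    return np.zeros_like(x_t)

def reverse_sample(shape, rng):
    """Reverse procedure: start from pure noise and iteratively denoise."""
    x = rng.standard_normal(shape)
    for t in reversed(range(T)):
        eps_hat = predict_noise(x, t)
        # Posterior mean of x_{t-1} given x_t under the predicted noise.
        mean = (x - betas[t] / np.sqrt(1.0 - alpha_bars[t]) * eps_hat) / np.sqrt(alphas[t])
        noise = rng.standard_normal(shape) if t > 0 else 0.0
        x = mean + np.sqrt(betas[t]) * noise
    return x

rng = np.random.default_rng(0)
sample = reverse_sample((2, 2), rng)   # each generated sample requires T reverse steps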