Theoretical Machine Learning Seminar
Generative Modeling by Estimating Gradients of the Data Distribution
Existing generative models are typically based on either explicit representations of probability distributions (e.g., autoregressive models or VAEs) or implicit sampling procedures (e.g., GANs). We propose an alternative approach based on directly modeling the vector field of gradients of the data distribution (scores). Our framework allows flexible energy-based model architectures and requires neither sampling during training nor adversarial training methods. Using annealed Langevin dynamics, we produce samples comparable to GANs on the MNIST, CelebA, and CIFAR-10 datasets, achieving a new state-of-the-art inception score of 8.91 on CIFAR-10. Finally, I will discuss challenges in evaluating bias and generalization in generative models.
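For intuition, here is a minimal NumPy sketch of the annealed Langevin dynamics sampler mentioned in the abstract. It assumes a trained score network exposed as score_fn(x, sigma) (a hypothetical interface) that estimates the score of the sigma-smoothed data distribution; the step-size schedule alpha_i = eps * sigma_i^2 / sigma_L^2 and the defaults (eps = 2e-5, 100 steps per noise level, 10 geometrically spaced levels from 1.0 down to 0.01) follow the annealed Langevin dynamics algorithm of the underlying paper.

import numpy as np

def annealed_langevin_sample(score_fn, x0, sigmas, eps=2e-5, steps_per_sigma=100):
    # Anneal over decreasing noise levels, warm-starting each level
    # from the samples produced at the previous (noisier) level.
    x = np.array(x0, dtype=float)
    for sigma in sigmas:                      # sigmas sorted largest -> smallest
        # Step size scaled to the current noise level: eps * sigma^2 / sigma_L^2.
        alpha = eps * (sigma / sigmas[-1]) ** 2
        for _ in range(steps_per_sigma):
            z = np.random.randn(*x.shape)     # fresh Gaussian noise each step
            x = x + 0.5 * alpha * score_fn(x, sigma) + np.sqrt(alpha) * z
    return x

# Toy check with a known score: for data ~ N(0, I), the sigma-smoothed
# density is N(0, (1 + sigma^2) I), whose score is -x / (1 + sigma^2).
toy_score = lambda x, sigma: -x / (1.0 + sigma ** 2)
sigmas = np.geomspace(1.0, 0.01, num=10)      # geometric noise schedule
samples = annealed_langevin_sample(toy_score, np.random.randn(1000, 2), sigmas)

In the real method, score_fn would be a noise-conditional neural network trained with score matching; the closed-form Gaussian score above only serves to make the sketch self-contained and runnable.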
Location
Remote Access Only - see link below
Notes
Please note: interested participants will need to fill out this Google form in advance to obtain access to this seminar. Once approved, you will receive the login details. Members of the IAS community do not need to fill out this form - the login details will be emailed to you.