Class Central Classrooms beta
YouTube videos curated by Class Central.
Classroom Contents
Scalable and Reliable Inference for Probabilistic Modeling
- 1 Intro
- 2 Many Areas are Revolutionized by Data
- 3 Probabilistic Modeling Pipeline
- 4 Scalable and Reliable Inference
- 5 Talk Outline
- 6 Metropolis-Hastings (MH)
- 7 Minibatch to scale MH
- 8 Empirical Verification
- 9 A: Poisson-Minibatching
- 10 Guaranteed Exactness
- 11 Guaranteed Scalability
- 12 Gaussian Mixture
- 13 Logistic Regression on MNIST
- 14 Gibbs Sampling (Geman et al., 1984): the de facto inference method for graphical models
- 15 Inference on Graphical Models
- 16 Poisson-Minibatching for Gibbs Sampling
- 17 Theoretically-Guaranteed Inference
- 18 Question 1
- 19 Why Deep Learning Needs Reliable Inference
- 20 Stochastic gradient MCMC
- 21 Improvements for SG-MCMC
- 22 Question 2: How do you efficiently explore the city? By car or on foot?
- 23 Problem Analysis
- 24 Our Solution: Cyclical Stepsize Schedule
- 25 CSG-MCMC Details: Introduce a system temperature to control the sampler's behavior
- 26 Convergence Guarantees
- 27 Mixture of 25 Gaussians
- 28 Bayesian Neural Networks
- 29 ImageNet
- 30 Efficient Inference for Reliable Deep Learning
- 31 Push the Frontier
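The outline above opens with Metropolis-Hastings (MH) before discussing how to scale it with minibatches. For readers unfamiliar with the baseline algorithm, here is a minimal random-walk MH sketch in Python; the function name, step size, and standard-normal target are illustrative choices, not details from the talk.

```python
import math
import random

def metropolis_hastings(log_target, init, n_steps, step_size=0.5):
    """Sample from an unnormalized log-density via random-walk MH."""
    x = init
    samples = []
    for _ in range(n_steps):
        # Propose a symmetric Gaussian perturbation around the current state.
        proposal = x + random.gauss(0.0, step_size)
        # Accept with probability min(1, pi(proposal) / pi(x)),
        # computed in log space for numerical stability.
        log_alpha = log_target(proposal) - log_target(x)
        if math.log(random.random()) < log_alpha:
            x = proposal
        samples.append(x)
    return samples

# Example: standard normal target, whose log-density is -x^2/2 up to a constant.
samples = metropolis_hastings(lambda x: -0.5 * x * x, init=0.0, n_steps=5000)
mean = sum(samples) / len(samples)
```

Note that each step evaluates the full target density; the minibatch variants in the talk aim to avoid exactly this per-step cost on large datasets.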
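The talk also covers Gibbs sampling for graphical models. As a toy illustration of the technique (not the talk's Poisson-minibatching variant), the sketch below alternately samples each coordinate of a bivariate standard normal from its exact conditional; the correlation parameter and function name are assumptions for the example.

```python
import math
import random

def gibbs_bivariate_normal(rho, n_steps):
    """Gibbs sampling for a bivariate standard normal with correlation rho:
    each conditional x | y is N(rho * y, 1 - rho^2), and symmetrically for y | x."""
    x, y = 0.0, 0.0
    cond_sd = math.sqrt(1.0 - rho * rho)
    samples = []
    for _ in range(n_steps):
        x = random.gauss(rho * y, cond_sd)  # resample x given y
        y = random.gauss(rho * x, cond_sd)  # resample y given x
        samples.append((x, y))
    return samples

# Draw samples and check that the empirical correlation tracks rho.
samples = gibbs_bivariate_normal(0.5, 20000)
n = len(samples)
mx = sum(x for x, _ in samples) / n
my = sum(y for _, y in samples) / n
cov = sum((x - mx) * (y - my) for x, y in samples) / n
vx = sum((x - mx) ** 2 for x, _ in samples) / n
vy = sum((y - my) ** 2 for _, y in samples) / n
corr = cov / math.sqrt(vx * vy)
```

Gibbs sampling needs tractable full conditionals; for general graphical models with large state spaces, each conditional can be expensive to compute, which motivates the Poisson-minibatching approach in the outline.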
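Items 24-25 describe a cyclical stepsize schedule for SG-MCMC. A common concrete choice is a cosine schedule that restarts at a large stepsize at the start of each cycle (encouraging exploration of new modes) and decays toward zero within the cycle (refining samples near the current mode). The sketch below is one such schedule under assumed parameter names; the specific functional form used in the talk may differ.

```python
import math

def cyclical_stepsize(k, total_steps, n_cycles, max_step):
    """Cosine cyclical schedule: the stepsize restarts at max_step at the
    beginning of each cycle and decays toward 0 within the cycle."""
    cycle_len = total_steps // n_cycles
    pos = (k % cycle_len) / cycle_len  # position within the current cycle, in [0, 1)
    return max_step / 2 * (math.cos(math.pi * pos) + 1)

# Example: 1000 total steps split into 4 cycles with peak stepsize 0.1.
schedule = [cyclical_stepsize(k, 1000, 4, 0.1) for k in range(1000)]
```

A single decaying stepsize tends to get a sampler stuck near one mode; the periodic restarts are what let cyclical SG-MCMC visit multiple modes, as in the 25-Gaussian mixture experiment listed above.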