Overview
Syllabus
Intro
Many Areas are Revolutionized by Data
Probabilistic Modeling Pipeline
Scalable and Reliable Inference
Talk Outline
Metropolis-Hastings (MH)
Minibatching to scale MH
Empirical Verification
A: Poisson-Minibatching
Guaranteed Exactness
Guaranteed Scalability
Gaussian Mixture
Logistic Regression on MNIST
Gibbs sampling (Geman & Geman, 1984): the de facto inference method for graphical models
Inference on Graphical Models
Poisson-Minibatching for Gibbs Sampling
Theoretically-Guaranteed Inference
Question 1
Why Deep Learning Needs Reliable Inference
Stochastic gradient MCMC (SG-MCMC)
Improvements for SG-MCMC
Question 2: How do you efficiently explore the city? By car or on foot?
Problem Analysis
Our solution: Cyclical stepsize schedule (see the sketch after this outline)
CSG-MCMC Details: Introduce a system temperature to control the sampler's behavior
Convergence Guarantees
Mixture of 25 Gaussians
Bayesian Neural Networks
ImageNet
Efficient Inference for Reliable Deep Learning
Push the Frontier
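The cyclical stepsize idea referenced in the outline above (CSG-MCMC) can be summarized in a few lines: the stepsize follows a cosine schedule that restarts every cycle, and injected Gaussian noise scaled by a system temperature turns the optimizer into a sampler. The snippet below is a minimal sketch of that schedule applied to SGLD on a toy Gaussian target; the helper names `cyclical_stepsize` and `sgld_step`, the toy target, and the 0.2·α₀ cutoff for the sampling stage are illustrative assumptions, not the exact algorithm from the talk.

```python
import numpy as np

def cyclical_stepsize(k, n_iters, n_cycles, alpha0):
    """Cosine cyclical stepsize that restarts every ceil(n_iters / n_cycles) steps."""
    cycle_len = int(np.ceil(n_iters / n_cycles))
    r = (k % cycle_len) / cycle_len        # position within the current cycle, in [0, 1)
    return 0.5 * alpha0 * (np.cos(np.pi * r) + 1.0)

def sgld_step(theta, grad_log_post, alpha, temperature, rng):
    """One SGLD update: gradient step plus Gaussian noise scaled by the temperature."""
    noise = rng.normal(size=theta.shape) * np.sqrt(2.0 * alpha * temperature)
    return theta + alpha * grad_log_post(theta) + noise

# Toy target (an assumption for illustration): a standard 2-D Gaussian,
# so grad log p(theta) = -theta.
grad_log_post = lambda theta: -theta

n_iters, n_cycles, alpha0 = 10_000, 5, 0.05
rng = np.random.default_rng(0)
theta, samples = np.zeros(2), []
for k in range(n_iters):
    alpha = cyclical_stepsize(k, n_iters, n_cycles, alpha0)
    theta = sgld_step(theta, grad_log_post, alpha, temperature=1.0, rng=rng)
    # Large stepsizes early in a cycle explore; collect samples only once the
    # stepsize has decayed (a crude stand-in for a separate sampling stage).
    if alpha < 0.2 * alpha0:
        samples.append(theta.copy())

samples = np.array(samples)                # draws approximately following N(0, I)
```

Setting the temperature toward 0 removes the noise and recovers plain gradient ascent with a cyclical learning rate (pure exploration), while temperature 1 targets the full posterior.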
Taught by
Simons Institute