
Scalable and Reliable Inference for Probabilistic Modeling

Simons Institute via YouTube

Overview

Explore scalable and reliable inference techniques for probabilistic modeling in this 42-minute lecture by Ruqi Zhang from UT Austin. Delve into the Metropolis-Hastings algorithm and learn how minibatching can enhance its scalability. Examine Poisson-Minibatching's guaranteed exactness and scalability through empirical verification on Gaussian mixtures and logistic regression. Investigate Gibbs sampling for graphical models and its Poisson-Minibatching adaptation. Discover why deep learning requires reliable inference and explore stochastic gradient MCMC improvements. Analyze the Cyclical Stepsize Schedule solution for efficient exploration, including convergence guarantees. Apply these concepts to mixture models, Bayesian neural networks, and ImageNet. Gain insights into pushing the frontier of efficient inference for reliable deep learning in this Joint IFML/CCSI Symposium talk from the Simons Institute.
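The Metropolis-Hastings algorithm discussed in the talk can be illustrated with a minimal random-walk sampler. This is a generic sketch (not the speaker's Poisson-Minibatching variant): it assumes a symmetric Gaussian proposal, so the acceptance ratio reduces to the ratio of target densities.

```python
import numpy as np

def metropolis_hastings(log_prob, x0, n_samples, step=1.0, seed=0):
    """Random-walk Metropolis-Hastings.

    Propose x' ~ N(x, step^2); accept with probability
    min(1, p(x') / p(x)) since the proposal is symmetric.
    """
    rng = np.random.default_rng(seed)
    x, lp = x0, log_prob(x0)
    samples = np.empty(n_samples)
    for i in range(n_samples):
        x_new = x + step * rng.standard_normal()
        lp_new = log_prob(x_new)
        # Accept/reject in log space to avoid underflow
        if np.log(rng.random()) < lp_new - lp:
            x, lp = x_new, lp_new
        samples[i] = x  # rejected proposals repeat the current state
    return samples

# Example target: standard normal, log-density up to a constant
samples = metropolis_hastings(lambda x: -0.5 * x * x, 0.0, 20000)
```

Minibatch variants replace the full-data acceptance test with one computed on a data subsample; the talk's point is that naive subsampling loses exactness, which Poisson-Minibatching restores with guarantees.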

Syllabus

Intro
Many Areas are Revolutionized by Data
Probabilistic Modeling Pipeline
Scalable and Reliable Inference
Talk Outline
Metropolis-Hastings (MH)
Minibatch to scale MH
Empirical Verification
A: Poisson-Minibatching
Guaranteed Exactness
Guaranteed Scalability
Gaussian Mixture
Logistic Regression on MNIST
Gibbs Sampling (Geman et al. 1984): De Facto Inference Method for Graphical Models
Inference on Graphical Models
Poisson-Minibatching for Gibbs Sampling
Theoretically-Guaranteed Inference
Question 1
Why Deep Learning Needs Reliable Inference
Stochastic gradient MCMC
Improvements for SG-MCMC
Question 2: How Do You Efficiently Explore the City? By Car or on Foot?
Problem Analysis
Our Solution: Cyclical Stepsize Schedule
CSG-MCMC Details: Introduce a System Temperature to Control the Sampler's Behavior
Convergence Guarantees
Mixture of 25 Gaussians
Bayesian Neural Networks
ImageNet
Efficient Inference for Reliable Deep Learning
Push the Frontier
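The cyclical stepsize schedule mentioned above can be sketched with a cosine rule: each cycle restarts with a large stepsize (fast exploration, "by car") and decays toward zero (careful local sampling, "on foot"). The function below is an illustrative sketch of that idea; the parameter names are assumptions, not the talk's notation.

```python
import math

def cyclical_stepsize(k, total_steps, n_cycles, alpha0):
    """Cosine cyclical stepsize schedule.

    k: current iteration (0-indexed)
    total_steps: total number of iterations
    n_cycles: number of warm-restart cycles
    alpha0: initial (maximum) stepsize

    Returns alpha0 at the start of each cycle and decays
    smoothly toward 0 by the cycle's end.
    """
    steps_per_cycle = math.ceil(total_steps / n_cycles)
    t = (k % steps_per_cycle) / steps_per_cycle  # position within cycle, in [0, 1)
    return (alpha0 / 2.0) * (math.cos(math.pi * t) + 1.0)
```

In a CSG-MCMC-style run, the large-stepsize phase of each cycle is used to jump between modes, and samples are typically collected only in the small-stepsize tail of each cycle.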

Taught by

Simons Institute

