Scalable and Reliable Inference for Probabilistic Modeling

Simons Institute via YouTube


Classroom Contents

  1. Intro
  2. Many Areas are Revolutionized by Data
  3. Probabilistic Modeling Pipeline
  4. Scalable and Reliable Inference
  5. Talk Outline
  6. Metropolis-Hastings (MH); see the sketch after this list
  7. Minibatch to Scale MH
  8. Empirical Verification
  9. A: Poisson-Minibatching
  10. Guaranteed Exactness
  11. Guaranteed Scalability
  12. Gaussian Mixture
  13. Logistic Regression on MNIST
  14. Gibbs Sampling (Geman et al. 1984): De Facto Inference Method for Graphical Models
  15. Inference on Graphical Models
  16. Poisson-Minibatching for Gibbs Sampling
  17. Theoretically Guaranteed Inference
  18. Question 1
  19. Why Deep Learning Needs Reliable Inference
  20. Stochastic Gradient MCMC
  21. Improvements for SG-MCMC
  22. Question 2: How do you efficiently explore the city? By car or on foot?
  23. Problem Analysis
  24. Our Solution: Cyclical Stepsize Schedule; see the stepsize sketch after this list
  25. CSG-MCMC Details: Introduce a system temperature to control the sampler's behavior
  26. Convergence Guarantees
  27. Mixture of 25 Gaussians
  28. Bayesian Neural Networks
  29. ImageNet
  30. Efficient Inference for Reliable Deep Learning
  31. Push the Frontier
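
For context on item 6, here is a minimal random-walk Metropolis-Hastings sketch in Python. It is a generic textbook version for a one-dimensional target, not the Poisson-minibatching variant presented in the talk; the function and parameter names are illustrative assumptions.

```python
import numpy as np

def metropolis_hastings(log_target, proposal_std, x0, n_samples, rng=None):
    """Random-walk Metropolis-Hastings for a 1-D unnormalized log density."""
    rng = np.random.default_rng() if rng is None else rng
    x = x0
    log_p = log_target(x)
    samples = np.empty(n_samples)
    for i in range(n_samples):
        # Propose a move from a symmetric Gaussian random walk.
        x_new = x + proposal_std * rng.standard_normal()
        log_p_new = log_target(x_new)
        # Accept with probability min(1, p(x_new) / p(x)).
        if np.log(rng.uniform()) < log_p_new - log_p:
            x, log_p = x_new, log_p_new
        samples[i] = x
    return samples

# Example: sample from a standard normal target (log density up to a constant).
draws = metropolis_hastings(lambda x: -0.5 * x**2, proposal_std=1.0,
                            x0=0.0, n_samples=5000)
```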
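Items 24 and 25 concern a cyclical stepsize schedule for stochastic gradient MCMC (CSG-MCMC). The sketch below shows the commonly cited cosine form of such a schedule, where the stepsize decays from its initial value within each cycle and then restarts; the variable names are illustrative and the exact schedule used in the talk may differ.

```python
import math

def cyclical_stepsize(k, total_steps, n_cycles, alpha0):
    """Cosine cyclical stepsize for iteration k (0-based) out of total_steps,
    split into n_cycles cycles, starting each cycle at stepsize alpha0."""
    steps_per_cycle = math.ceil(total_steps / n_cycles)
    # Position within the current cycle, in [0, 1).
    t = (k % steps_per_cycle) / steps_per_cycle
    # Decays from alpha0 toward 0 within a cycle, then jumps back to alpha0.
    return 0.5 * alpha0 * (math.cos(math.pi * t) + 1.0)

# Example: a schedule with 4 cycles over 10,000 iterations.
schedule = [cyclical_stepsize(k, 10_000, 4, 0.01) for k in range(10_000)]
```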
