Overview
Explore advanced techniques for scaling Bayesian inference to big and complex data in this 47-minute lecture by David Dunson of Duke University. Delve into the computational challenges in machine learning, comparing traditional approaches with deep learning methods. Examine mixing rate problems, MCMC games, and strategies for handling large datasets. Learn about stochastic approximation, sparse linear programming, and theoretical guarantees in Bayesian inference. Investigate popular algorithms for logistic regression and Gaussian process models, and understand their limitations and potential improvements. Gain insight into why certain algorithms fail and consider innovative approaches to overcoming these challenges in machine learning and Bayesian statistics.
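As a rough illustration of the "Big N" and subset ideas the lecture touches on (not code from the talk itself), the sketch below runs a naive random-walk Metropolis sampler for Bayesian logistic regression in which the log-likelihood is estimated from a random subsample each iteration. The dataset, subset size, and step size are all made-up choices for demonstration; as the lecture's discussion of why algorithms fail suggests, this kind of naive subsampling does not target the exact posterior.

```python
import numpy as np

# Hypothetical sketch (not from the lecture): random-walk Metropolis for
# Bayesian logistic regression, with the log-likelihood estimated on a
# random subset of the data each iteration ("Big N" / subset idea).
rng = np.random.default_rng(0)

# Simulated "large" dataset -- sizes chosen only for illustration.
N, d = 50_000, 3
X = rng.normal(size=(N, d))
true_beta = np.array([1.0, -2.0, 0.5])
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ true_beta)))

def log_post_subset(beta, idx):
    """Subset-based estimate of the log posterior, rescaled to full N."""
    logits = X[idx] @ beta
    loglik = (N / len(idx)) * np.sum(y[idx] * logits - np.log1p(np.exp(logits)))
    logprior = -0.5 * np.sum(beta ** 2)  # standard normal prior
    return loglik + logprior

beta = np.zeros(d)
m = 1_000                                # subset size per iteration
samples = []
idx = rng.choice(N, size=m, replace=False)
curr = log_post_subset(beta, idx)
for t in range(5_000):
    prop = beta + 0.05 * rng.normal(size=d)      # random-walk proposal
    idx = rng.choice(N, size=m, replace=False)   # fresh subsample
    prop_lp = log_post_subset(prop, idx)
    # Note: comparing estimates from different subsets introduces bias;
    # the exact posterior is not the stationary distribution here.
    if np.log(rng.uniform()) < prop_lp - curr:
        beta, curr = prop, prop_lp
    samples.append(beta.copy())

print("posterior mean estimate:", np.mean(samples[1_000:], axis=0))
```

The point of the sketch is the trade-off it exposes: each iteration costs O(m) rather than O(N), but the subsampled acceptance ratio only approximates the true one, which is exactly the kind of limitation of popular scalable algorithms the lecture examines.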
Syllabus
Intro
Machine Learning vs Deep Learning
Mixing Rate Problems
MCMC Games
Given Time
Big N Problems
Subsets
Stochastic approximation
Sparse linear program
Theoretical guarantees
Biometrika
MCMC
Logistic Regression
Gaussian Process Models
Popular Algorithms
Why Algorithms Fail
Are You Changing the Way
Taught by
Simons Institute