

Optimization

MITCBMM via YouTube

Overview

Explore optimization techniques in this 56-minute tutorial from the MIT BMM Summer Course 2018, presented by Kevin Smith. Dive into key concepts such as maximum likelihood estimation, cost functions, and gradient descent. Learn about grid search, local vs. global minima, and the differences between convex and non-convex functions. Examine practical applications through examples like the balls in urns problem and the lecture attendance problem. Discover how optimization applies to machine learning, including stochastic gradient descent, regularization, and sparse coding. Gain insights into multi-dimensional gradients, differentiable functions, and the importance of momentum in optimization algorithms. This comprehensive tutorial provides a solid foundation for understanding and implementing optimization techniques in various computational and analytical contexts.
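To make the central idea concrete, here is a minimal, hedged sketch of gradient descent on a simple convex cost function. This is my own illustration, not code from the lecture; the function names and learning rate are assumptions chosen for clarity.

```python
# Illustrative sketch (not from the course): minimize the convex cost
# f(x) = (x - 3)^2 by repeatedly stepping opposite its gradient.
def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Iteratively move against the gradient to approach a local minimum."""
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# f(x) = (x - 3)^2 has gradient f'(x) = 2(x - 3), so the minimum is at x = 3.
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(round(x_min, 4))  # converges close to 3.0
```

Because this cost is convex, any starting point converges to the single global minimum; the lecture's discussion of local vs. global minima concerns non-convex costs, where the starting point matters.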

Syllabus

What you will learn
Materials and notes
What is the likelihood?
Example: Balls in urns
Maximum likelihood estimator
Cost functions
Likelihood - Cost
Grid search (brute force)
Local vs. global minima
Convex vs. non-convex functions
Implementation
Lecture attendance problem
Multi-dimensional gradients
Multi-dimensional gradient descent
Differentiable functions
Optimization for machine learning
Stochastic gradient descent
Regularization
Sparse coding
Momentum
Important terms
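As a companion to the "Balls in urns" syllabus item, the sketch below shows the simplest version of maximum likelihood estimation. It is an assumed minimal example, not the lecture's own code: for independent Bernoulli draws (red vs. not red), the likelihood-maximizing estimate of the red-ball probability is just the observed fraction.

```python
# Hedged sketch of a balls-in-urns style MLE (my own minimal version):
# estimate the probability p of drawing a red ball from observed draws.
def mle_bernoulli(draws):
    """draws: list of 1 (red) / 0 (not red) outcomes. Returns the MLE of p."""
    return sum(draws) / len(draws)

draws = [1, 0, 1, 1, 0, 1, 0, 1]  # 5 red out of 8 draws
print(mle_bernoulli(draws))  # 0.625
```

The same principle underlies the lecture's cost-function framing: minimizing the negative log-likelihood over p yields exactly this sample fraction.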

Taught by

MITCBMM
