Introduction to Optimization

MITCBMM via YouTube

Overview

Dive into the fundamentals of optimization in this 1-hour 12-minute tutorial led by Kevin Smith from MIT. Explore key concepts such as maximum likelihood estimation, cost functions, and gradient descent methods. Learn about convex and non-convex functions, local and global minima, and their implications in optimization problems. Discover practical applications through examples like balls in urns and coin flips. Advance to multi-dimensional gradients and their role in machine learning. Gain insights into stochastic gradient descent, regularization techniques, and sparse coding. Perfect for those looking to enhance their understanding of optimization principles and their applications in various fields, including machine learning.
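As a taste of the gradient descent material covered in the tutorial, here is a minimal sketch (not taken from the lecture itself): descending a convex quadratic cost, which has a single global minimum, so gradient descent converges from any starting point.

```python
def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Repeatedly step opposite the gradient of the cost function."""
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

# Convex cost f(x) = (x - 3)^2 has gradient 2*(x - 3) and minimum at x = 3.
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
```

With a non-convex cost, the same procedure may instead stop at a local minimum, which is one of the distinctions the tutorial draws.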

Syllabus

Intro
What you will learn
Before we start
What is the likelihood?
Example: Balls in urns
Maximum likelihood estimator
Example: Coin flips
Likelihood - Cost
Back to the urn problem...
Grid search (brute force)
Local vs. global minima
Convex vs. non-convex functions
Implementation
Lecture attendance problem
Multi-dimensional gradients
Multi-dimensional gradient descent
Differentiable functions
Optimization for machine learning
Stochastic gradient descent
Regularization
Sparse coding
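The coin-flip example in the syllabus can be sketched as follows (a hedged illustration, not the lecture's own code): for n independent flips with k heads, the likelihood p^k (1-p)^(n-k) is maximized at p = k/n, and minimizing the negative log-likelihood as a cost function gives the same answer.

```python
import math

def coin_flip_mle(flips):
    """Maximum likelihood estimate of heads probability: k / n."""
    return sum(flips) / len(flips)

def negative_log_likelihood(p, flips):
    """Cost-function view of the same problem: -log L(p)."""
    k, n = sum(flips), len(flips)
    return -(k * math.log(p) + (n - k) * math.log(1 - p))

flips = [1, 0, 1, 1, 0, 1, 1, 0]  # 5 heads in 8 flips
p_hat = coin_flip_mle(flips)      # 5 / 8 = 0.625
```

Evaluating the negative log-likelihood at p_hat and at any other value of p shows that p_hat indeed attains the lowest cost, connecting the "Likelihood - Cost" step of the syllabus to the estimator.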

Taught by

MITCBMM
