Overview
Explore the concept of implicit regularization in deep learning through this lecture from the Deep Learning Boot Camp. Delve into topics such as boosting, complexity control, optimization landscapes, implicit biases, and matrix completion. Understand the goal of learning through practical examples and gain insights into stochastic optimization techniques. Examine gradient descent and stochastic gradient descent and their roles in implicit regularization. Learn from Nati Srebro of the Toyota Technological Institute at Chicago as he provides an in-depth analysis of this crucial aspect of machine learning.
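For readers new to the phenomenon, here is a minimal sketch (not drawn from the lecture itself) of one classic instance of implicit regularization: plain gradient descent on an overparameterized least-squares problem, initialized at zero, converges to the minimum-norm solution that fits the data, even though nothing in the loss penalizes the norm. The problem sizes, step size, and iteration count below are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

# Underdetermined least squares: more parameters than data points,
# so infinitely many weight vectors fit the data exactly.
n, d = 20, 100
X = rng.standard_normal((n, d))
y = rng.standard_normal(n)

# Plain gradient descent on the squared loss, initialized at zero.
w = np.zeros(d)
lr = 0.01
for _ in range(20000):
    grad = X.T @ (X @ w - y) / n
    w -= lr * grad

# The minimum-norm interpolating solution, computed via the pseudoinverse.
w_min_norm = np.linalg.pinv(X) @ y

print("training residual:", np.linalg.norm(X @ w - y))
print("distance to min-norm solution:", np.linalg.norm(w - w_min_norm))

Because every gradient lies in the row space of X and the iterate starts at zero, the component of w orthogonal to the data never moves, so gradient descent selects the minimum-norm interpolant. This is the sense in which the optimization algorithm itself, rather than an explicit penalty, supplies the regularization discussed in the lecture.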
Syllabus
Introduction
Boosting
Complexity Control
Optimization Landscape
Biases
Matrix Completion
Gradient Descent
Outline
Goal of Learning
Example
Stochastic Optimization
Recap
Stochastic Gradient Descent
Taught by
Simons Institute