Overview
Explore the implicit regularization effect of Stochastic Gradient Descent (SGD) in this lecture by Pierfrancesco Beneventano of Princeton University. The talk examines how SGD's stochastic mini-batch updates bias training toward particular solutions and what that means for model performance and generalization, covering both the theoretical foundations and the practical implications of this implicit regularization, a central question in deep learning optimization.
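As a point of reference (not taken from the talk), the minimal sketch below contrasts full-batch gradient descent with mini-batch SGD on a toy least-squares problem; the only algorithmic difference is the random mini-batch used to estimate the gradient, and it is the noise from this sampling that the lecture's analysis of implicit regularization concerns. All names and parameters here are illustrative.

```python
# Minimal sketch (illustrative only): full-batch GD vs. mini-batch SGD.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: linear model y = X w + noise.
n_samples, n_features = 200, 10
X = rng.normal(size=(n_samples, n_features))
w_true = rng.normal(size=n_features)
y = X @ w_true + 0.1 * rng.normal(size=n_samples)

def mse_grad(w, Xb, yb):
    """Gradient of mean squared error on a (mini-)batch."""
    return 2.0 / len(yb) * Xb.T @ (Xb @ w - yb)

def full_batch_gd(lr=0.05, steps=500):
    w = np.zeros(n_features)
    for _ in range(steps):
        w -= lr * mse_grad(w, X, y)            # deterministic full-gradient step
    return w

def sgd(lr=0.05, batch_size=8, epochs=500):
    w = np.zeros(n_features)
    for _ in range(epochs):
        idx = rng.permutation(n_samples)       # reshuffle the data each epoch
        for s in range(0, n_samples, batch_size):
            b = idx[s:s + batch_size]
            w -= lr * mse_grad(w, X[b], y[b])  # noisy mini-batch step
    return w

print("GD  train loss:", np.mean((X @ full_batch_gd() - y) ** 2))
print("SGD train loss:", np.mean((X @ sgd() - y) ** 2))
```

On this convex toy problem both methods fit the data; the regularizing consequences of the mini-batch noise become important in the over-parameterized, non-convex settings the lecture addresses.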
Syllabus
Towards Understanding the Implicit Regularization Effect of SGD
Taught by
MIT CBMM