Overview
Learn about the Stochastic Gradient Descent (SGD) algorithm and its application to optimizing Support Vector Machine (SVM) objectives in this lecture from the University of Utah Data Science program. The lecture covers the fundamental concepts, mathematical foundations, and practical implementation of SGD for SVM training, an optimization method that has become essential for fitting large-scale machine learning models efficiently.
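To give a concrete flavor of the technique the lecture covers, the sketch below applies SGD to the standard regularized hinge-loss SVM objective, (λ/2)‖w‖² + (1/n)Σ max(0, 1 − yᵢ w·xᵢ), using a 1/(λt) step-size schedule. This is a minimal illustrative example, not code from the lecture itself; the function name, parameters, and toy data are assumptions made for the sketch.

```python
import numpy as np

def sgd_svm(X, y, lam=0.01, epochs=20, seed=0):
    """Minimal SGD sketch for the linear SVM objective
    (lam/2)*||w||^2 + mean(hinge loss); illustrative only."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(n):
            t += 1
            eta = 1.0 / (lam * t)      # decaying step size
            margin = y[i] * (X[i] @ w)
            if margin < 1:
                # hinge loss is active: subgradient includes -y_i * x_i
                w = (1 - eta * lam) * w + eta * y[i] * X[i]
            else:
                # only the regularizer contributes to the update
                w = (1 - eta * lam) * w
    return w

# Toy usage: two well-separated clusters with labels in {-1, +1}
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(+2.0, size=(50, 2)),
                   rng.normal(-2.0, size=(50, 2))])
    y = np.array([1] * 50 + [-1] * 50)
    w = sgd_svm(X, y)
    print("training accuracy:", np.mean(np.sign(X @ w) == y))
```

Because the hinge loss is not differentiable at the margin, the update uses a subgradient; each step touches only a single randomly chosen example, which is what makes the method scale to large datasets.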
Syllabus
Machine Learning: Lecture 22: Stochastic Gradient Descent for SVM
Taught by
UofU Data Science