Large Scale Machine Learning and Convex Optimization - Lecture 1
Hausdorff Center for Mathematics via YouTube
Overview
Explore the intersection of large-scale machine learning and convex optimization in this comprehensive lecture by Francis Bach. Delve into the challenges of handling large datasets with many observations and high-dimensional features. Learn about online algorithms such as stochastic gradient descent and their advantages over batch algorithms for processing large-scale data. Examine the optimal convergence rates for general convex and strongly convex functions. Discover how the smoothness of loss functions can be leveraged to develop new algorithms with improved performance. Investigate a novel Newton-based stochastic approximation algorithm that achieves faster convergence rates without strong convexity assumptions. Gain insight into the practical benefits of combining batch and online algorithms, including linear convergence rates on strongly convex problems at a per-iteration cost comparable to stochastic gradient descent.
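To make the central idea concrete, below is a minimal sketch of stochastic gradient descent with Polyak-Ruppert averaging on a synthetic least-squares problem. It illustrates the kind of online algorithm the lecture discusses, not Bach's exact method; the synthetic data, the O(1/sqrt(t)) step-size schedule, and the least-squares loss are assumptions chosen for illustration.

```python
# Minimal sketch: averaged SGD for least-squares regression.
# Assumptions (not from the lecture): synthetic Gaussian data,
# step size 1/sqrt(t+1), squared loss on single observations.
import numpy as np

rng = np.random.default_rng(0)
n, d = 10_000, 20
X = rng.standard_normal((n, d))
w_true = rng.standard_normal(d)
y = X @ w_true + 0.1 * rng.standard_normal(n)

w = np.zeros(d)      # current iterate
w_avg = np.zeros(d)  # running average of iterates (Polyak-Ruppert averaging)
for t in range(n):
    i = rng.integers(n)                 # draw one observation at random
    grad = (X[i] @ w - y[i]) * X[i]     # stochastic gradient of (1/2)(x_i^T w - y_i)^2
    step = 1.0 / np.sqrt(t + 1)         # decaying step size for general convex losses
    w -= step * grad
    w_avg += (w - w_avg) / (t + 1)      # incremental update of the average

print("distance to w_true:", np.linalg.norm(w_avg - w_true))
```

Each iteration touches a single observation, so the cost per step is O(d) regardless of n; this is the efficiency advantage over batch gradient methods that the lecture contrasts with the slower convergence rates of stochastic approximation.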
Syllabus
Francis Bach: Large scale Machine Learning and Convex Optimization (Lecture 1)
Taught by
Hausdorff Center for Mathematics