Stochastic Gradient Descent and Machine Learning - Lecture 1
International Centre for Theoretical Sciences via YouTube
Syllabus
5 different facets of optimization
Optimization
1. Iterative methods
Black-box oracles
2. Gradient descent
3. Newton's method
Cheap gradient principle
Fixed points of GD
Proposition
Proof
Convexity
Examples of convex functions
Theorem
Proof
g_x is a subgradient of a convex function f at x (see the sketch after this syllabus)
Example
Theorem
Claim
Wrap Up
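
As a companion to the iterative methods listed in the syllabus above, here is a minimal LaTeX sketch of the standard update rules; the step size \eta, iterates x_k, and objective f are generic notation assumed here, not taken verbatim from the lecture.

\begin{align*}
% Gradient descent: first-order update from an initial point x_0,
% with step size \eta > 0.
x_{k+1} &= x_k - \eta \nabla f(x_k) \\
% Newton's method: second-order update using the Hessian \nabla^2 f.
x_{k+1} &= x_k - \left[ \nabla^2 f(x_k) \right]^{-1} \nabla f(x_k)
\end{align*}

A point x^* is a fixed point of the gradient-descent update exactly when \nabla f(x^*) = 0, i.e. the fixed points of GD are the stationary points of f; this characterization is presumably what the proposition in the "Fixed points of GD" chapter establishes.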
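Similarly, for the convexity and subgradient chapters, a short sketch of the standard definitions, again in generic notation rather than the lecturer's:

\begin{align*}
% f is convex if, for all x, y and all \lambda \in [0, 1]:
f(\lambda x + (1 - \lambda) y) &\le \lambda f(x) + (1 - \lambda) f(y) \\
% g_x is a subgradient of a convex f at x if, for all y:
f(y) &\ge f(x) + \langle g_x,\, y - x \rangle
\end{align*}

For example, f(x) = |x| is convex but not differentiable at 0, and any g \in [-1, 1] satisfies the subgradient inequality there; at x \neq 0 the only subgradient is the derivative \mathrm{sign}(x).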
Taught by
International Centre for Theoretical Sciences