Class Central Classrooms (beta) — YouTube videos curated by Class Central.
Classroom Contents
Stochastic Gradient Descent and Machine Learning - Lecture 1
1. Stochastic Gradient Descent and Machine Learning - Lecture 1
2. 5 different facets of optimization
3. Optimization
4. 1. Iterative methods
5. Blackbox oracles
6. 2. Gradient descent
7. 3. Newton's method
8. Cheap gradient principle
9. Fixed points of GD
10. Proposition
11. Proof
12. Convexity
13. Examples of convex functions
14. Theorem
15. Proof
16. g_x is a subgradient of a convex function f at x
17. Example
18. Theorem
19. Claim
20. Wrap Up
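The chapters above cover gradient descent and its fixed points (a point where the gradient vanishes is unchanged by the update). As a minimal illustrative sketch — not code from the lecture — here is fixed-step gradient descent on the convex quadratic f(x) = (x − 3)², whose unique fixed point is the minimizer x = 3:

```python
def gradient_descent(grad, x0, step=0.1, iters=100):
    """Run fixed-step gradient descent x <- x - step * grad(x) from x0."""
    x = x0
    for _ in range(iters):
        x = x - step * grad(x)
    return x

# f(x) = (x - 3)^2 has gradient 2(x - 3); any x with grad f(x) = 0 is a
# fixed point of the update, here x = 3.
x_star = gradient_descent(lambda x: 2.0 * (x - 3.0), x0=0.0)
print(x_star)  # converges toward the minimizer x = 3
```

Because f is convex, this fixed point is the global minimum, matching the chapter sequence from "Fixed points of GD" through "Convexity".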