Overview
Dive into a comprehensive 1-hour 37-minute video lecture on Stochastic Gradient Descent (SGD). Explore the fundamentals of this randomized optimization method, the algorithm behind it, and how it differs from standard batch Gradient Descent. Learn about the objective, types, advantages, and disadvantages of Gradient Descent methods, and understand the role of mini-batches and momentum in the optimization process. Discover why Stochastic Gradient Descent is needed and why it converges. Gain insight into this crucial technique for training large-scale machine learning models and deep neural networks.
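To make the core idea concrete, here is a minimal sketch of the single-example updates the lecture describes, applied to a toy least-squares problem. The data, function names, and hyperparameters (lr, n_epochs) are illustrative assumptions, not taken from the course.

```python
# A minimal sketch of stochastic gradient descent on least squares.
# Everything here (data, names, learning rate) is illustrative only.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic linear-regression data: y = X @ w_true + noise
X = rng.normal(size=(200, 3))
w_true = np.array([2.0, -1.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=200)

def sgd(X, y, lr=0.01, n_epochs=50):
    """Update the weights using one randomly chosen example at a time."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_epochs):
        for i in rng.permutation(n):           # random selection of examples
            grad_i = (X[i] @ w - y[i]) * X[i]  # gradient of one squared error
            w -= lr * grad_i                   # step against that gradient
    return w

print(sgd(X, y))  # should land close to w_true
```

Because each step uses only one example, the per-step cost is constant in the dataset size; the trade-off, covered later in the lecture, is that the updates are noisy.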
Syllabus
Agenda for the session.
Objective of Gradient Descent.
Gradient Descent - The Algorithm.
Types of Gradient Descent.
Stochastic Gradient Descent.
Is Stochastic Gradient Descent the Same as Gradient Descent?
Why Is Stochastic Gradient Descent Needed?
How Does the Stochastic Gradient Descent Algorithm Work?
Advantages of Stochastic Gradient Descent.
Disadvantages of Stochastic Gradient Descent.
Mini-batches in Gradient Descent (see the sketch after this syllabus).
Momentum in Gradient Descent (also covered in the sketch after this syllabus).
Why Does Stochastic Gradient Descent Converge?
Summarizing the session.
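As referenced in the mini-batch and momentum items above, here is a hedged sketch combining both variants on the same kind of toy least-squares problem. The batch size and momentum coefficient (batch_size, beta) are assumed values, not taken from the lecture.

```python
# A sketch of the mini-batch and momentum variants named in the syllabus.
# Hyperparameter values here are assumptions chosen for illustration.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = X @ np.array([2.0, -1.0, 0.5]) + 0.1 * rng.normal(size=200)

def minibatch_sgd_momentum(X, y, lr=0.05, batch_size=16, beta=0.9, n_epochs=50):
    """Average gradients over small random batches and smooth them
    with an exponentially weighted velocity term (momentum)."""
    n, d = X.shape
    w = np.zeros(d)
    v = np.zeros(d)                                 # velocity accumulator
    for _ in range(n_epochs):
        idx = rng.permutation(n)
        for start in range(0, n, batch_size):
            b = idx[start:start + batch_size]       # one random mini-batch
            grad = (X[b] @ w - y[b]) @ X[b] / len(b)  # batch-averaged gradient
            v = beta * v + grad                     # momentum update
            w -= lr * v                             # step along the velocity
    return w

print(minibatch_sgd_momentum(X, y))  # should land close to [2.0, -1.0, 0.5]
```

Batch averaging reduces the variance of each update relative to single-example SGD, while the velocity term damps oscillations and helps the iterates keep moving in a consistent descent direction.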
Taught by
Great Learning