Gradient Descent and Stochastic Gradient Descent

Paul Hand via YouTube

Overview

Explore gradient descent and stochastic gradient descent in deep neural networks in this 57-minute lecture. Examine the effects of varying the learning rate, including the consequences of rates that are too high or too low, and analyze convergence rates for both gradient descent and stochastic gradient descent on convex functions. Accompanying notes are available for a more thorough treatment. The lecture is part of Northeastern University's CS 7150 Deep Learning course (Summer 2020); its sections are listed in the syllabus below.
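The contrast the lecture draws between the two methods can be sketched numerically. The following is a minimal illustration (not taken from the lecture) of gradient descent and stochastic gradient descent on a convex least-squares objective f(x) = (1/2n)·||Ax − b||², including what happens when the step size is too large; all variable names and step sizes are illustrative choices, not values from the course.

```python
import numpy as np

# Synthetic convex problem: recover x_true from noiseless measurements b = A x_true.
rng = np.random.default_rng(0)
n, d = 200, 5
A = rng.normal(size=(n, d))
x_true = rng.normal(size=d)
b = A @ x_true

def gradient_descent(lr, steps=500):
    """Full-batch gradient descent on f(x) = (1/2n) ||Ax - b||^2."""
    x = np.zeros(d)
    for _ in range(steps):
        grad = A.T @ (A @ x - b) / n   # exact gradient over all n samples
        x -= lr * grad
    return x

def sgd(lr, steps=5000):
    """Stochastic gradient descent: one randomly chosen sample per step."""
    x = np.zeros(d)
    for _ in range(steps):
        i = rng.integers(n)
        grad = A[i] * (A[i] @ x - b[i])  # unbiased estimate of the full gradient
        x -= lr * grad
    return x

x_gd = gradient_descent(lr=0.1)          # moderate step size: converges
x_sgd = sgd(lr=0.01)                     # smaller step size to tame gradient noise
x_bad = gradient_descent(lr=2.5, steps=50)  # step size too large: iterates diverge

print(np.linalg.norm(x_gd - x_true))
print(np.linalg.norm(x_sgd - x_true))
print(np.linalg.norm(x_bad - x_true))
```

With a well-conditioned problem like this, both errors shrink toward zero, while the oversized learning rate makes the error blow up, matching the lecture's point that the step size governs whether the iteration converges at all.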

Syllabus

Introduction
Gradient Descent Convergence
Recovery Theorem
Proof
Interpretation
Gradient Descent Challenges
Stochastic Gradient Descent
Step Sizes and Learning Rates
Challenges
Learning Rates

Taught by

Paul Hand
