

Stochastic Gradient Descent and Backpropagation

Alfredo Canziani via YouTube

Overview

Dive into a comprehensive lecture on stochastic gradient descent and backpropagation, covering parameterized models, loss functions, and gradient-based methods. Learn how to implement neural networks in PyTorch and understand the generalized form of backpropagation. Examine concrete examples, discuss the dimensions of Jacobian matrices, and explore various neural-net modules while computing their gradients. Gain insights into softmax and log-softmax, along with practical tricks for backpropagation, to deepen your understanding of deep learning concepts and techniques.
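To make the pattern concrete, here is a minimal sketch of the training loop the lecture builds toward: a small PyTorch network updated with stochastic gradient descent, where loss.backward() performs backpropagation. The toy data, layer sizes, and learning rate are illustrative assumptions, not values taken from the lecture.

```python
import torch
import torch.nn as nn

# Hypothetical toy data: 100 samples, 10 features, 3 classes.
x = torch.randn(100, 10)
y = torch.randint(0, 3, (100,))

model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 3))
criterion = nn.CrossEntropyLoss()  # internally combines LogSoftMax and NLLLoss
optimiser = torch.optim.SGD(model.parameters(), lr=0.1)

for step in range(100):
    optimiser.zero_grad()            # clear gradients from the previous step
    loss = criterion(model(x), y)    # forward pass and loss computation
    loss.backward()                  # backpropagation: fills each p.grad
    optimiser.step()                 # SGD update: p <- p - lr * p.grad
```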

Syllabus

– Week 2 – Lecture
– Gradient Descent Optimization Algorithm
– Advantages of SGD and Backpropagation for Traditional Neural Nets
– PyTorch Implementation of a Neural Network and a Generalized Backprop Algorithm
– Basic Modules - LogSoftMax (see the sketch after this syllabus)
– Practical Tricks for Backpropagation
– Computing Gradients for NN Modules and Practical Tricks for Backpropagation
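The LogSoftMax item above refers to a well-known numerical-stability trick: computing log-softmax as a shifted log-sum-exp, log softmax(x)_i = x_i - max(x) - log(sum_j exp(x_j - max(x))), so that exp() never overflows. The sketch below is general background on that trick, not necessarily the lecture's exact derivation.

```python
import torch

def log_softmax(x: torch.Tensor) -> torch.Tensor:
    # Shift by the row maximum before exponentiating so exp() stays finite.
    shifted = x - x.max(dim=-1, keepdim=True).values
    return shifted - shifted.exp().sum(dim=-1, keepdim=True).log()

logits = torch.tensor([[1000.0, 1001.0, 1002.0]])  # naive exp() would overflow
print(log_softmax(logits))                 # finite values, no overflow
print(torch.log_softmax(logits, dim=-1))   # matches PyTorch's built-in
```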

Taught by

Alfredo Canziani
