Class Central Classrooms beta
YouTube videos curated by Class Central.
Classroom Contents
Stochastic Gradient Descent and Backpropagation
- 1 – Week 2 – Lecture
- 2 – Gradient Descent Optimization Algorithm
- 3 – Advantages of SGD, Backpropagation for Traditional Neural Net
- 4 – PyTorch implementation of Neural Network and a Generalized Backprop Algorithm
- 5 – Basic Modules - LogSoftMax
- 6 – Practical Tricks for Backpropagation
- 7 – Computing Gradients for NN Modules and Practical Tricks for Backpropagation
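Video 5 in the list covers the LogSoftMax module. As a quick, hedged illustration of the idea behind it (this sketch is ours, not taken from the lecture), a numerically stable log-softmax can be written in plain NumPy using the log-sum-exp trick:

```python
import numpy as np

def log_softmax(x):
    # Subtract the max before exponentiating (log-sum-exp trick) so large
    # scores don't overflow; algebraically the result is unchanged:
    # log_softmax(x) = (x - max) - log(sum(exp(x - max)))
    shifted = x - np.max(x)
    return shifted - np.log(np.sum(np.exp(shifted)))

scores = np.array([1.0, 2.0, 3.0])
log_probs = log_softmax(scores)
# Exponentiating the log-probabilities recovers a valid distribution.
print(np.exp(log_probs).sum())  # sums to 1
```

Frameworks such as PyTorch provide this as a built-in module, but the stability trick above is the core of any implementation.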