Class Central Classrooms (beta)
YouTube videos curated by Class Central.
Classroom Contents
Intro to Neural Networks and Backpropagation - Building Micrograd
- 1 intro
- 2 micrograd overview
- 3 derivative of a simple function with one input
- 4 derivative of a function with multiple inputs
- 5 starting the core Value object of micrograd and its visualization (see the Value sketch after this list)
- 6 manual backpropagation example #1: simple expression
- 7 preview of a single optimization step
- 8 manual backpropagation example #2: a neuron
- 9 implementing the backward function for each operation
- 10 implementing the backward function for a whole expression graph
- 11 fixing a backprop bug when one node is used multiple times
- 12 breaking up a tanh, exercising with more operations
- 13 doing the same thing but in PyTorch: comparison (see the PyTorch sketch after this list)
- 14 building out a neural net library (multi-layer perceptron) in micrograd (see the training sketch after this list)
- 15 creating a tiny dataset, writing the loss function
- 16 collecting all of the parameters of the neural net
- 17 doing gradient descent optimization manually, training the network
- 18 summary of what we learned, how to go towards modern neural nets
- 19 walkthrough of the full code of micrograd on github
- 20 real stuff: diving into PyTorch, finding their backward pass for tanh
- 21 conclusion
- 22 outtakes :)
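Chapters 5 through 11 build micrograd's core Value object and its backward pass. Below is a minimal sketch of that idea written from scratch for illustration, not the library's actual source; it follows micrograd's public design (a scalar Value that records its children and a _backward closure per operation). The gradient accumulation with += is what fixes the multi-use-node bug from chapter 11.

```python
import math

class Value:
    """A scalar that tracks the computation graph for backpropagation."""
    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None   # how to route this node's gradient to its parents
        self._prev = set(_children)

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            self.grad += out.grad        # d(a+b)/da = 1
            other.grad += out.grad       # d(a+b)/db = 1
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad   # d(a*b)/da = b
            other.grad += self.data * out.grad   # d(a*b)/db = a
        out._backward = _backward
        return out

    def tanh(self):
        t = math.tanh(self.data)
        out = Value(t, (self,))
        def _backward():
            self.grad += (1 - t**2) * out.grad   # d tanh(x)/dx = 1 - tanh(x)^2
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph, then apply the chain rule in reverse order.
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

# The neuron example from chapter 8: o = tanh(x1*w1 + x2*w2 + b)
x1, x2 = Value(2.0), Value(0.0)
w1, w2 = Value(-3.0), Value(1.0)
b = Value(6.8813735870195432)
o = (x1*w1 + x2*w2 + b).tanh()
o.backward()
print(x1.grad, w1.grad)   # gradients of o w.r.t. an input and a weight
```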
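Chapter 13 checks the hand-built gradients against PyTorch's autograd. A sketch of that comparison using PyTorch's standard API; the values mirror the neuron example above, so the printed gradients should match the micrograd sketch.

```python
import torch

# leaf tensors with gradient tracking enabled
x1 = torch.tensor(2.0, requires_grad=True)
x2 = torch.tensor(0.0, requires_grad=True)
w1 = torch.tensor(-3.0, requires_grad=True)
w2 = torch.tensor(1.0, requires_grad=True)
b  = torch.tensor(6.8813735870195432, requires_grad=True)

o = torch.tanh(x1*w1 + x2*w2 + b)
o.backward()                             # PyTorch builds the graph and backpropagates for us
print(x1.grad.item(), w1.grad.item())    # expected: -1.5 and 1.0, as in the sketch above
```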
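Chapters 14 through 17 build the MLP library, create a tiny dataset, write the loss, and run gradient descent by hand. A sketch of that loop against micrograd's published nn API, assuming the library is installed (pip install micrograd); the dataset values follow the small example used in the video, and the learning rate and step count are illustrative.

```python
from micrograd.nn import MLP

# tiny dataset: 4 examples with 3 inputs each, and target values
xs = [[2.0, 3.0, -1.0],
      [3.0, -1.0, 0.5],
      [0.5, 1.0, 1.0],
      [1.0, 1.0, -1.0]]
ys = [1.0, -1.0, -1.0, 1.0]

model = MLP(3, [4, 4, 1])   # 3 inputs -> two hidden layers of 4 -> 1 output

for step in range(20):
    # forward pass: squared-error loss summed over the dataset
    ypred = [model(x) for x in xs]
    loss = sum((yout - ygt)**2 for ygt, yout in zip(ys, ypred))

    # backward pass: zero stale gradients first, then backpropagate
    model.zero_grad()
    loss.backward()

    # manual gradient descent step on every parameter (chapter 17)
    for p in model.parameters():
        p.data += -0.05 * p.grad

    print(step, loss.data)
```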