Intro to Neural Networks and Backpropagation - Building Micrograd

Andrej Karpathy via YouTube

Current chapter: manual backpropagation example #2: a neuron (8 of 22)

Classroom Contents

  1. intro
  2. micrograd overview
  3. derivative of a simple function with one input (see the numerical-derivative sketch after this list)
  4. derivative of a function with multiple inputs
  5. starting the core Value object of micrograd and its visualization (see the Value sketch after this list)
  6. manual backpropagation example #1: simple expression
  7. preview of a single optimization step
  8. manual backpropagation example #2: a neuron
  9. implementing the backward function for each operation
  10. implementing the backward function for a whole expression graph
  11. fixing a backprop bug when one node is used multiple times
  12. breaking up a tanh, exercising with more operations
  13. doing the same thing but in PyTorch: comparison
  14. building out a neural net library (multi-layer perceptron) in micrograd
  15. creating a tiny dataset, writing the loss function
  16. collecting all of the parameters of the neural net
  17. doing gradient descent optimization manually, training the network
  18. summary of what we learned, how to go towards modern neural nets
  19. walkthrough of the full code of micrograd on github
  20. real stuff: diving into PyTorch, finding their backward pass for tanh
  21. conclusion
  22. outtakes :)
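
A quick preview of the numerical-derivative idea from chapters 3 and 4. This is a rough, illustrative sketch rather than the course's exact code; the test function f and step size h below are assumptions chosen for the example.

```python
# Estimate the derivative of f at x as the slope over a tiny interval h.
# Assumption: f(x) = 3x^2 - 4x + 5 is an illustrative quadratic, not necessarily the course's.
def f(x):
    return 3 * x**2 - 4 * x + 5

h = 1e-6   # small nudge applied to x
x = 3.0
slope = (f(x + h) - f(x)) / h
print(slope)  # ~14.0, matching the analytic derivative 6x - 4 at x = 3
```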
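
Chapters 5 through 12 build micrograd's core Value object and its backward pass. The sketch below is a heavily simplified take in the spirit of micrograd (https://github.com/karpathy/micrograd), not the lecture's exact code: only add, mul, and tanh are implemented, and the example input and weight values are assumptions, just enough to backpropagate through a single neuron as in chapter 8.

```python
import math

class Value:
    """A scalar that remembers how it was produced, so gradients can flow back."""

    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None   # closure applying the local chain rule
        self._prev = set(_children)

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            # += (not =) accumulates gradients: this is the bug fixed in
            # chapter 11 when one node feeds into several others
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def tanh(self):
        t = math.tanh(self.data)
        out = Value(t, (self,))
        def _backward():
            self.grad += (1 - t**2) * out.grad  # d/dx tanh(x) = 1 - tanh(x)^2
        out._backward = _backward
        return out

    def backward(self):
        # topologically sort the graph, then apply each node's local rule in reverse
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for node in reversed(topo):
            node._backward()

# a neuron in the style of chapter 8: o = tanh(x1*w1 + x2*w2 + b)
# (these particular numbers are assumptions for illustration)
x1, x2 = Value(2.0), Value(0.0)
w1, w2 = Value(-3.0), Value(1.0)
b = Value(6.8813735870195432)
o = (x1 * w1 + x2 * w2 + b).tanh()
o.backward()
print(o.data)            # ~0.7071
print(x1.grad, w1.grad)  # -1.5 and 1.0, checkable against a manual chain-rule pass
```

Wrapping lists of such neurons into layer and MLP classes, adding a loss over a tiny dataset, and nudging each parameter against its gradient is what the remaining chapters (14 through 17) build on top of this object.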
