Class Central Classrooms beta
YouTube videos curated by Class Central.
Classroom Contents
Building Makemore - Activations & Gradients, BatchNorm
1. intro
2. starter code
3. fixing the initial loss
4. fixing the saturated tanh
5. calculating the init scale: “Kaiming init”
6. batch normalization
7. batch normalization: summary
8. real example: resnet50 walkthrough
9. summary of the lecture
10. just kidding: part 2: PyTorch-ifying the code
11. viz #1: forward pass activations statistics
12. viz #2: backward pass gradient statistics
13. the fully linear case of no non-linearities
14. viz #3: parameter activation and gradient statistics
15. viz #4: update:data ratio over time
16. bringing back batchnorm, looking at the visualizations
17. summary of the lecture for real this time
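The chapters on the saturated tanh and the Kaiming init scale can be illustrated with a minimal sketch (this is not code from the lecture; the dimensions and thresholds below are assumptions for illustration). It compares tanh saturation under a naive unit-variance weight init against Kaiming init with the tanh gain of 5/3:

```python
import numpy as np

rng = np.random.default_rng(0)
fan_in = 200
x = rng.standard_normal((1000, fan_in))  # unit-variance inputs

def saturation(pre):
    """Fraction of tanh outputs in the flat tails (|tanh| > 0.99),
    where the local gradient is nearly zero."""
    return (np.abs(np.tanh(pre)) > 0.99).mean()

# Naive init: W ~ N(0, 1) -> pre-activation std grows like sqrt(fan_in)
w_naive = rng.standard_normal((fan_in, fan_in))

# Kaiming init for tanh: std = gain / sqrt(fan_in), with gain = 5/3
w_kaiming = rng.standard_normal((fan_in, fan_in)) * (5 / 3) / np.sqrt(fan_in)

sat_naive = saturation(x @ w_naive)
sat_kaiming = saturation(x @ w_kaiming)
print(f"naive saturation:   {sat_naive:.2f}")
print(f"kaiming saturation: {sat_kaiming:.2f}")
```

With the naive init most tanh units land in the flat tails, while the Kaiming-scaled weights keep the saturated fraction small — the effect the lecture's viz #1 (forward pass activation statistics) makes visible layer by layer.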