Class Central Classrooms beta
YouTube videos curated by Class Central.
Classroom Contents
Gradient Descent and the Backpropagation Algorithm
- 1 – Supervised learning
- 2 – Parametrised models
- 3 – Block diagram
- 4 – Loss function, average loss
- 5 – Gradient descent
- 6 – Traditional neural nets
- 7 – Backprop through a non-linear function
- 8 – Backprop through a weighted sum
- 9 – PyTorch implementation (see the sketch after this list)
- 10 – Backprop through a functional module
- 11 – Backprop through a functional graph
- 12 – Backprop in practice
- 13 – Learning representations
- 14 – Shallow networks are universal approximators!
- 15 – Multilayer architectures == compositional structure of data
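As a companion to segments 5 (Gradient descent) and 9 (PyTorch implementation), here is a minimal sketch of training a small "traditional" network with plain gradient descent in PyTorch. The architecture, synthetic data, and hyperparameters (hidden width 16, learning rate 0.1, 100 steps) are illustrative assumptions and are not taken from the lecture itself.

```python
# Minimal sketch (not from the course materials): a two-layer network trained
# with plain gradient descent. Data, sizes, and hyperparameters are assumed.
import torch

torch.manual_seed(0)

# Synthetic regression data: inputs in R^2, scalar targets (illustrative only).
x = torch.randn(64, 2)
y = (x[:, :1] ** 2 - x[:, 1:]).detach()

# "Traditional neural net": alternating weighted sums and non-linear functions.
model = torch.nn.Sequential(
    torch.nn.Linear(2, 16),   # weighted sum
    torch.nn.ReLU(),          # non-linear function
    torch.nn.Linear(16, 1),   # weighted sum
)

loss_fn = torch.nn.MSELoss()                              # average loss over the batch
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)   # gradient descent step

for step in range(100):
    optimizer.zero_grad()          # clear gradients from the previous step
    loss = loss_fn(model(x), y)    # forward pass: compute the average loss
    loss.backward()                # backprop: chain rule through every module
    optimizer.step()               # move parameters along the negative gradient

print(f"final loss: {loss.item():.4f}")
```

The `loss.backward()` call is where backpropagation happens: gradients flow backward through each weighted sum and non-linearity in the model, which is the mechanism segments 7, 8, 10, and 11 cover in detail.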