Classroom Contents
Recurrent Neural Networks, Vanilla and Gated (LSTM)
- 1 – Good morning
- 2 – How to summarise papers as @y0b1byte with Notion
- 3 – Why do we need to go to a higher hidden dimension?
- 4 – Today's class: recurrent neural nets
- 5 – Vector to sequence (vec2seq)
- 6 – Sequence to vector (seq2vec)
- 7 – Sequence to vector to sequence (seq2vec2seq)
- 8 – Sequence to sequence (seq2seq)
- 9 – Training a recurrent network: backpropagation through time
- 10 – Training example: language model
- 11 – Vanishing & exploding gradients and gating mechanism
- 12 – The Long Short-Term Memory (LSTM); see the gating equations sketched after this list
- 13 – Jupyter Notebook and PyTorch in action: sequence classification; a minimal sketch follows after this list
- 14 – Inspecting the activation values
- 15 – Closing remarks
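
For reference alongside chapter 12, these are the standard LSTM gating equations, written with the usual notation (input $x_t$, hidden state $h_t$, cell state $c_t$, elementwise product $\odot$); the lecture's own notation may differ slightly.

```latex
% Standard LSTM cell: input, forget, and output gates, candidate cell state,
% cell update, and hidden-state readout.
\begin{aligned}
i_t &= \sigma(W_i x_t + U_i h_{t-1} + b_i) \\
f_t &= \sigma(W_f x_t + U_f h_{t-1} + b_f) \\
o_t &= \sigma(W_o x_t + U_o h_{t-1} + b_o) \\
\tilde{c}_t &= \tanh(W_c x_t + U_c h_{t-1} + b_c) \\
c_t &= f_t \odot c_{t-1} + i_t \odot \tilde{c}_t \\
h_t &= o_t \odot \tanh(c_t)
\end{aligned}
```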
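
As a companion to chapter 13, here is a minimal sketch of LSTM-based sequence classification in PyTorch. The toy dataset, model sizes, and training loop are illustrative assumptions, not the notebook's actual setup.

```python
# Minimal LSTM sequence classifier sketch (assumed setup, not the lecture notebook).
import torch
import torch.nn as nn

class SequenceClassifier(nn.Module):
    def __init__(self, input_size=8, hidden_size=16, num_classes=2):
        super().__init__()
        self.lstm = nn.LSTM(input_size, hidden_size, batch_first=True)
        self.fc = nn.Linear(hidden_size, num_classes)

    def forward(self, x):
        # x: (batch, seq_len, input_size)
        _, (h_n, _) = self.lstm(x)   # h_n: (1, batch, hidden_size), last hidden state
        return self.fc(h_n[-1])      # logits: (batch, num_classes)

# Toy data (assumed): class 1 sequences carry a constant offset, class 0 are pure noise.
torch.manual_seed(0)
x0 = torch.randn(64, 10, 8)
x1 = torch.randn(64, 10, 8) + 0.5
x = torch.cat([x0, x1])
y = torch.cat([torch.zeros(64, dtype=torch.long), torch.ones(64, dtype=torch.long)])

model = SequenceClassifier()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
criterion = nn.CrossEntropyLoss()

for epoch in range(20):
    optimizer.zero_grad()
    loss = criterion(model(x), y)
    loss.backward()
    optimizer.step()

print("final loss:", loss.item())
```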