Class Central Classrooms beta
YouTube videos curated by Class Central.
Classroom Contents
Building Makemore - MLP
- 1 intro
- 2 Bengio et al. 2003 MLP language model paper walkthrough
- 3 re-building our training dataset
- 4 implementing the embedding lookup table
- 5 implementing the hidden layer + internals of torch.Tensor: storage, views
- 6 implementing the output layer
- 7 implementing the negative log likelihood loss
- 8 summary of the full network
- 9 introducing F.cross_entropy and why
- 10 implementing the training loop, overfitting one batch
- 11 training on the full dataset, minibatches
- 12 finding a good initial learning rate
- 13 splitting up the dataset into train/val/test splits and why
- 14 experiment: larger hidden layer
- 15 visualizing the character embeddings
- 16 experiment: larger embedding size
- 17 summary of our final code, conclusion
- 18 sampling from the model
- 19 google colab (new!!) notebook advertisement
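The chapters above trace one pipeline: build a context-window dataset, look up character embeddings, pass them through a tanh hidden layer to output logits, score with negative log likelihood, and hold out train/val/test splits. Below is a minimal pure-Python sketch of that forward pass and split, not the lecture's PyTorch code: the toy name list, layer sizes, and 80/10/10 split fractions are illustrative assumptions.

```python
import math
import random

random.seed(42)

# Toy corpus of names; the lecture uses a much larger names dataset (assumption).
words = ["emma", "olivia", "ava", "isabella", "sophia", "mia", "amelia"]

# Character vocabulary with '.' as the start/end token, as in the lecture.
chars = sorted(set("".join(words)))
stoi = {c: i + 1 for i, c in enumerate(chars)}
stoi["."] = 0
vocab_size = len(stoi)

block_size = 3  # context length: how many previous characters predict the next

def build_dataset(ws):
    # Slide a block_size window over each word to form (context, target) pairs.
    X, Y = [], []
    for w in ws:
        context = [0] * block_size
        for ch in w + ".":
            X.append(list(context))
            Y.append(stoi[ch])
            context = context[1:] + [stoi[ch]]
    return X, Y

# Train/val/test split (roughly 80/10/10), as in chapter 13.
random.shuffle(words)
n1, n2 = int(0.8 * len(words)), int(0.9 * len(words))
Xtr, Ytr = build_dataset(words[:n1])
Xva, Yva = build_dataset(words[n1:n2])
Xte, Yte = build_dataset(words[n2:])

# Tiny parameters: embedding table C, hidden layer (W1, b1), output (W2, b2).
emb_dim, hidden = 2, 8  # illustrative sizes, far smaller than the lecture's
rnd = lambda: random.gauss(0, 0.1)
C  = [[rnd() for _ in range(emb_dim)] for _ in range(vocab_size)]
W1 = [[rnd() for _ in range(hidden)] for _ in range(block_size * emb_dim)]
b1 = [rnd() for _ in range(hidden)]
W2 = [[rnd() for _ in range(vocab_size)] for _ in range(hidden)]
b2 = [rnd() for _ in range(vocab_size)]

def forward(x):
    # Embedding lookup: concatenate the block_size embedding vectors.
    e = [v for ix in x for v in C[ix]]
    # Hidden layer with tanh nonlinearity.
    h = [math.tanh(sum(e[i] * W1[i][j] for i in range(len(e))) + b1[j])
         for j in range(hidden)]
    # Output logits, then softmax to a probability distribution.
    logits = [sum(h[i] * W2[i][k] for i in range(hidden)) + b2[k]
              for k in range(vocab_size)]
    m = max(logits)
    exps = [math.exp(l - m) for l in logits]
    s = sum(exps)
    return [p / s for p in exps]

def nll(X, Y):
    # Average negative log likelihood -- what F.cross_entropy computes,
    # but done numerically stably there via the log-sum-exp trick.
    return -sum(math.log(forward(x)[y]) for x, y in zip(X, Y)) / len(X)

loss = nll(Xtr, Ytr)
```

With near-zero initial weights the softmax is close to uniform, so the untrained loss sits near log(vocab_size); training (gradient descent on the parameters, covered in chapters 10-12) would push it below that baseline.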