The Complete Guide to Transformer Neural Networks
YouTube videos curated by Class Central.

Classroom Contents
- 1 Introduction
- 2 Transformer at a high level
- 3 Why Batch Data? Why Fixed Length Sequence?
- 4 Embeddings
- 5 Positional Encodings
- 6 Query, Key and Value vectors
- 7 Masked Multi Head Self Attention
- 8 Residual Connections
- 9 Layer Normalization
- 10 Decoder
- 11 Masked Multi Head Cross Attention
- 12
- 13 Tokenization & Generating the next translated word
- 14 Transformer Inference Example