Overview
Dive into a comprehensive 3-hour 35-minute video tutorial on transformers, starting from the basics and progressing to advanced concepts. Explore key topics including self-attention, multihead attention, position encoding, and layer normalization. Work through an in-depth analysis of the transformer architecture, followed by practical coding sessions for both the encoder and the decoder. Learn about sentence tokenization techniques and conclude with insights into the training and inference processes. Perfect for anyone looking to master transformers from the ground up.
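To give a flavor of the first key topic, here is a minimal NumPy sketch of scaled dot-product self-attention (this is an illustrative example, not code from the video; the weight matrices and dimensions are arbitrary assumptions):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the row max before exponentiating
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention for a single sequence.
    X: (seq_len, d_model); Wq, Wk, Wv: (d_model, d_k)."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])  # (seq_len, seq_len)
    weights = softmax(scores, axis=-1)       # each row sums to 1
    return weights @ V                       # (seq_len, d_k)

# Toy example with made-up sizes
rng = np.random.default_rng(0)
seq_len, d_model, d_k = 4, 8, 8
X = rng.normal(size=(seq_len, d_model))
Wq, Wk, Wv = (rng.normal(size=(d_model, d_k)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (4, 8)
```

Multihead attention, covered later in the tutorial, simply runs several such attention computations in parallel with independent projection matrices and concatenates the results.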
Syllabus
Thank you for 100K!
Transformer Overview
Self Attention
Multihead Attention
Position Encoding
Layer Normalization
Architecture Deep Dive
Encoder Code
Decoder Code
Sentence Tokenization
Training and Inference
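As a taste of the Position Encoding chapter, the sinusoidal encoding can be sketched in a few lines of NumPy (an illustrative example under standard assumptions, not code from the video):

```python
import numpy as np

def positional_encoding(seq_len, d_model):
    """Sinusoidal position encoding: even dimensions use sin,
    odd dimensions use cos, with geometrically spaced frequencies."""
    pos = np.arange(seq_len)[:, None]         # (seq_len, 1)
    i = np.arange(d_model // 2)[None, :]      # (1, d_model // 2)
    angles = pos / (10000 ** (2 * i / d_model))
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)
    pe[:, 1::2] = np.cos(angles)
    return pe

pe = positional_encoding(10, 16)
print(pe.shape)  # (10, 16)
# At position 0 all angles are 0, so sin dims are 0 and cos dims are 1
```

These encodings are added to the token embeddings before the first encoder layer, giving the otherwise order-agnostic attention mechanism information about token position.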
Taught by
CodeEmporium