Overview
Dive into a comprehensive 57-minute video tutorial on implementing PyTorch Transformers from scratch, based on the groundbreaking "Attention Is All You Need" paper. Explore the original transformer architecture, starting with a detailed paper review and progressing through key components such as the attention mechanism, transformer blocks, encoder, and decoder. Learn how to assemble these elements into a complete Transformer model, and gain practical insights through a small worked example and an error-fixing session. Benefit from additional resources, including recommended courses and free materials, to further enhance your understanding of machine learning, deep learning, and natural language processing.
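To give a flavor of the core building block the tutorial develops, here is a minimal sketch of scaled dot-product self-attention in PyTorch. The class and parameter names (SelfAttention, embed_size) are illustrative assumptions, not necessarily those used in the video, and the sketch shows a single attention head rather than the full multi-head module.

```python
# Illustrative sketch of scaled dot-product self-attention (single head).
# Names and structure are assumptions for demonstration, not the video's code.
import torch
import torch.nn as nn

class SelfAttention(nn.Module):
    def __init__(self, embed_size):
        super().__init__()
        self.embed_size = embed_size
        # Linear projections for queries, keys, and values
        self.to_q = nn.Linear(embed_size, embed_size)
        self.to_k = nn.Linear(embed_size, embed_size)
        self.to_v = nn.Linear(embed_size, embed_size)

    def forward(self, x, mask=None):
        # x: (batch, seq_len, embed_size)
        q, k, v = self.to_q(x), self.to_k(x), self.to_v(x)
        # Attention scores scaled by sqrt(d_k), as in "Attention Is All You Need"
        scores = q @ k.transpose(-2, -1) / (self.embed_size ** 0.5)
        if mask is not None:
            scores = scores.masked_fill(mask == 0, float("-inf"))
        weights = torch.softmax(scores, dim=-1)  # (batch, seq_len, seq_len)
        return weights @ v                       # (batch, seq_len, embed_size)

# Tiny usage example
x = torch.randn(2, 5, 64)            # 2 sequences, length 5, embedding dim 64
attn = SelfAttention(embed_size=64)
print(attn(x).shape)                 # torch.Size([2, 5, 64])
```

In the full architecture covered by the tutorial, this attention operation is wrapped with residual connections, layer normalization, and a feed-forward network to form a transformer block, which is then stacked to build the encoder and decoder.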
Syllabus
- Introduction
- Paper Review
- Attention Mechanism
- TransformerBlock
- Encoder
- DecoderBlock
- Decoder
- Putting it together to form The Transformer
- A Small Example
- Fixing Errors
- Ending
Taught by
Aladdin Persson