Overview
This lecture covers how to prepare data for transformer models and how the transformer architecture works. It walks through text and image preparation for transformers, including tokenization and other preprocessing steps; examines the transformer architecture and its key components and mechanisms; and explains the pretraining process and its role in building effective models, illustrated with practical examples of data handling and implementation.
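As a flavor of the text-preparation step the lecture covers, the sketch below shows the basic pipeline of tokenizing text and mapping tokens to padded integer ids. It uses a toy whitespace tokenizer and a made-up vocabulary for illustration only; the lecture's actual tokenizer and preprocessing may differ.

```python
# Minimal illustrative sketch of preparing text for a transformer.
# The whitespace tokenizer and toy vocabulary here are assumptions,
# not the tokenizer used in the lecture.

def build_vocab(corpus):
    """Map each unique whitespace token to an integer id."""
    tokens = sorted({tok for line in corpus for tok in line.lower().split()})
    # Reserve 0 for padding and 1 for unknown tokens.
    vocab = {"<pad>": 0, "<unk>": 1}
    vocab.update({tok: i + 2 for i, tok in enumerate(tokens)})
    return vocab

def encode(text, vocab, max_len=8):
    """Tokenize, map tokens to ids, and pad/truncate to a fixed length."""
    ids = [vocab.get(tok, vocab["<unk>"]) for tok in text.lower().split()]
    ids = ids[:max_len]
    ids += [vocab["<pad>"]] * (max_len - len(ids))
    return ids

corpus = ["transformers process token ids", "token ids feed the model"]
vocab = build_vocab(corpus)
print(encode("token ids", vocab))
```

Real transformer pipelines typically use subword tokenizers (e.g. BPE or WordPiece) rather than whitespace splitting, but the overall shape — tokenize, map to ids, pad to a fixed length — is the same.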
Syllabus
Recording starts
Lecture starts / Announcements
Preparing text for a transformer
Preparing images for a transformer
Transformer
Pretraining
Lecture ends
Taught by
UofU Data Science