Class Central Classrooms
YouTube videos curated by Class Central.
Classroom Contents
Conversational AI with Transformer Models - Building Blocks and Optimization Techniques
- 1 Intro
- 2 Why Conversational AI/Chatbots?
- 3 Chatbot Conversation Framework
- 4 Use-case in hand
- 5 Chatbot Flow Diagram
- 6 Components of NLU Engine
- 7 Transformers for Intent Classification
- 8 BERT: Bidirectional Encoder Representations from Transformers
- 9 Masked Language Model
- 10 Next Sentence Prediction
- 11 BERT: CLS token for classification
- 12 Different models with accuracy and size over time
- 13 Use-case data summary
- 14 Model Training
- 15 Efficient Model Inference
- 16 Knowledge Distillation
- 17 Quantization
- 18 No padding
- 19 Productizing BERT for CPU Inference
- 20 Ensembling LUIS and DistilBERT
- 21 Team behind the project