Conversational AI with Transformer Models - Building Blocks and Optimization Techniques
Databricks via YouTube
Overview
A Databricks talk on building the NLU engine of a chatbot with transformer models: why conversational AI, how BERT is used for intent classification, and how the model is optimized for CPU inference with knowledge distillation, quantization, and padding-free inputs, finishing with an ensemble of LUIS and DistilBERT.
Syllabus
Intro
Why Conversational AI/chatbots?
Chatbot Conversation Framework
Use-case at hand
Chatbot Flow Diagram
Components of NLU Engine
Transformers for Intent Classification
BERT: Bidirectional Encoder Representations from Transformers
Masked Language Model (sketch below)
Next Sentence Prediction
BERT: CLS token for classification (sketch below)
Different models with accuracy and size over time
Use-case data summary
Model Training
Efficient Model Inference (sketch below)
Knowledge Distillation
Quantization
No padding
Productizing BERT for CPU Inference
Ensembling LUIS and DistilBERT (sketch below)
Team behind the project
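A quick illustration of the "Masked Language Model" pretraining objective mentioned in the syllabus. This is a minimal sketch using the Hugging Face fill-mask pipeline with a public checkpoint, not a model from the talk: BERT predicts the token hidden behind [MASK] from its bidirectional context.

```python
# Masked language modeling demo with a public BERT checkpoint (assumption, not the talk's model).
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# The model scores candidate tokens for the masked position using context on both sides.
for prediction in fill_mask("I would like to [MASK] my order."):
    print(prediction["token_str"], round(prediction["score"], 3))
```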
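For the "Transformers for Intent Classification" and "BERT: CLS token for classification" items, here is a minimal sketch of how the [CLS] representation feeds a classification head in the Hugging Face transformers API. The checkpoint name and intent labels are placeholders, and the head is randomly initialized until fine-tuned on the use-case data (the "Model Training" step).

```python
# Intent classification via BERT's [CLS] token; model name and labels are assumptions.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_NAME = "bert-base-uncased"                                   # placeholder checkpoint
INTENTS = ["greeting", "order_status", "refund", "fallback"]       # hypothetical intent set

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=len(INTENTS))
model.eval()

def classify_intent(utterance: str) -> str:
    """Return the predicted intent for a single user utterance."""
    # The tokenizer prepends [CLS]; the classification head reads its final hidden state.
    inputs = tokenizer(utterance, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits
    return INTENTS[int(logits.argmax(dim=-1))]

print(classify_intent("Where is my order?"))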
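The "Efficient Model Inference" items (knowledge distillation, quantization, no padding) can be sketched as follows: swap BERT for a distilled student such as DistilBERT, apply post-training dynamic quantization to its Linear layers, and skip padding when serving one utterance at a time. The checkpoint and the 4-label head are assumptions; the distillation training step itself (fitting the student to the teacher's outputs) is not shown.

```python
# CPU-inference optimization sketch: distilled student + dynamic int8 quantization + no padding.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

checkpoint = "distilbert-base-uncased"   # placeholder distilled student
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
student = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=4)
student.eval()

# Dynamic quantization: nn.Linear weights are stored as int8 and dequantized on the fly,
# shrinking the model and typically speeding up CPU inference.
quantized = torch.quantization.quantize_dynamic(student, {torch.nn.Linear}, dtype=torch.qint8)

def predict(utterance: str) -> int:
    # "No padding": with a single utterance per request there is nothing to pad,
    # so no compute is wasted on pad tokens.
    inputs = tokenizer(utterance, return_tensors="pt", truncation=True, padding=False)
    with torch.no_grad():
        logits = quantized(**inputs).logits
    return int(logits.argmax(dim=-1))

print(predict("I want to check my order status"))
```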
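For "Ensembling LUIS and DistilBERT", one possible ensembling rule is a confidence fallback: keep the local DistilBERT prediction when it is confident, otherwise defer to LUIS. The 0.8 threshold and the (intent, confidence) tuples below are hypothetical; the talk's actual ensembling logic may differ.

```python
# Hypothetical confidence-threshold ensemble of two intent classifiers.
from typing import Tuple

Prediction = Tuple[str, float]  # (intent label, confidence score)

def ensemble_intent(distilbert: Prediction, luis: Prediction, threshold: float = 0.8) -> str:
    intent, confidence = distilbert
    if confidence >= threshold:
        return intent        # trust the local DistilBERT model when it is confident
    return luis[0]           # otherwise defer to the LUIS prediction

print(ensemble_intent(("refund", 0.55), ("order_status", 0.92)))  # -> order_status
```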
Taught by
Databricks