Conversational AI with Transformer Models - Building Blocks and Optimization Techniques

Databricks via YouTube

Components of NLU Engine

6 of 21

Classroom Contents

  1. Intro
  2. Why Conversational AI/chatbots?
  3. Chatbot Conversation Framework
  4. Use-case in hand
  5. Chatbot Flow Diagram
  6. Components of NLU Engine
  7. Transformers for Intent Classification
  8. BERT: Bidirectional Encoder Representations from Transformers
  9. Masked Language Model
  10. Next Sentence Prediction
  11. BERT: CLS token for classification
  12. Different models with accuracy and size over time
  13. Use-case data summary
  14. Model Training
  15. Efficient Model Inference
  16. Knowledge Distillation
  17. Quantization
  18. No padding
  19. Productizing BERT for CPU Inference
  20. Ensembling LUIS and DistilBERT
  21. Team behind the project
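Among the inference optimizations listed above, knowledge distillation (section 16) trains a small "student" model to match a large "teacher" such as BERT. As a rough illustration of the core idea only (the talk's actual training setup is not shown in this listing, and the function names and temperature value here are illustrative assumptions), the soft-target part of the distillation loss is a temperature-scaled cross-entropy between teacher and student outputs:

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; a higher temperature produces a
    # softer distribution that exposes the teacher's "dark knowledge".
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    # Cross-entropy between the teacher's softened distribution and the
    # student's softened distribution: the soft-target term of knowledge
    # distillation. It is minimized when the student matches the teacher.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q))
```

In practice this soft-target term is usually combined with the ordinary cross-entropy against the hard labels; a student that reproduces the teacher's logits exactly achieves the minimum of the soft-target term.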
