Overview
Learn how to build natural language processing (NLP) applications using pretrained transformer models from Hugging Face, the popular machine learning platform.
Syllabus
Introduction
- Building NLP apps with Transformers
- Course coverage and prerequisites
- Setting up the exercise files
- Question-answering in NLP
- Types of question-answering
- Building a Qu-An pipeline
- The SQuAD metric
- Evaluating Qu-An performance
Text summarization in NLP
- The BART model architecture
- Summarization with pipelines
- The ROUGE score
- Evaluating with ROUGE
Natural language generation in NLP
- Content creation with Transformers
- Conversation generation
- Chatbot conversation example
Machine translation in NLP
- Translating with Hugging Face Transformers
Training a custom model
- Loading a Hugging Face dataset
- Encoding and preprocessing the dataset
- Customizing the model architecture
- Training the sentiment model
- Predicting with the custom model
- Inference challenges with Transformers
- Customizing pretrained models
- Model compression overview
- Serving multiple models
- Continuing with Hugging Face
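To give a flavor of the evaluation topics in the syllabus, here is a minimal, self-contained sketch of the ROUGE-1 score (unigram overlap between a candidate summary and a reference). This is an illustration only, not the course's own code; real ROUGE toolkits such as the `rouge_score` package add stemming, tokenization options, and ROUGE-2/ROUGE-L variants.

```python
from collections import Counter

def rouge_1(candidate: str, reference: str) -> dict:
    """Simplified ROUGE-1: precision, recall, and F1 over unigram overlap.

    Overlap counts each shared word at most as many times as it appears
    in both texts (multiset intersection), as in standard ROUGE.
    """
    cand = candidate.lower().split()
    ref = reference.lower().split()
    overlap = sum((Counter(cand) & Counter(ref)).values())
    precision = overlap / len(cand) if cand else 0.0
    recall = overlap / len(ref) if ref else 0.0
    f1 = 2 * precision * recall / (precision + recall) if overlap else 0.0
    return {"precision": precision, "recall": recall, "f1": f1}

# Example: 5 of 6 unigrams overlap, so all three scores are 5/6.
scores = rouge_1("the cat sat on the mat", "the cat lay on the mat")
print(scores)
```

Because precision and recall weigh overlap against the candidate and reference lengths respectively, ROUGE-1 penalizes both over-long and over-short summaries, which is why the course pairs summarization pipelines with ROUGE-based evaluation.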
Taught by
Kumaran Ponnambalam