Learn how to perform AI text summarization using Hugging Face models and tools.
Overview
Syllabus
Introduction
- AI Text Summarization with Hugging Face
- Prerequisites
- Extractive text summarization
- Intermediate representations for extractive summarization
- Evaluation metrics for summaries
- Exploring Hugging Face
- Signing up for Hugging Face
- The sumy library for extractive summarization
- Extractive text summarization on Hugging Face
- Abstractive text summarization
- Abstractive summarization using the Hosted Inference API on Hugging Face
- Sequence-to-sequence models
- Attention in sequence-to-sequence models
- A brief introduction to Transformers
- Transformers in Hugging Face
- Using Colab to work with Hugging Face Transformers
- Loading the CNN Daily Mail dataset
- Cleaning text data
- Generating summaries with Hugging Face Transformers
- Evaluating summaries using ROUGE scores
- Summarizing text and computing aggregate ROUGE scores
- Understanding tokenizers
- Fine-tuning the T5 small model
- Pushing the model to the Hugging Face Hub
- Summarizing text using the fine-tuned model
- Accessing the BBC dataset on Google Drive
- Instantiating and cleaning the BBC News summaries dataset
- Generating summaries using Pegasus
- Generating multiple summaries and computing aggregate ROUGE scores
- Generating summaries using BART
- Computing ROUGE metrics for a set of summaries
- Summary and next steps
Taught by
Janani Ravi