Overview
Follow along with a hands-on lab video demonstrating how to fine-tune DistilBERT for sentiment analysis of restaurant reviews. Learn the fundamentals of knowledge distillation before diving into a practical implementation using a restaurant review dataset. Master PyTorch Dataset and DataLoader concepts while building a custom classification model from scratch. Explore fine-tuning techniques for DistilBERT, including performance evaluation and weight-freezing strategies. Conclude by implementing the same solution using the streamlined Hugging Face Transformers library and its Trainer class. Access the provided Colab notebook to practice alongside the demonstration, with references to academic papers and documentation for a deeper understanding of the concepts covered.
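The Dataset and DataLoader step described above can be sketched as follows. This is an illustrative example, not the course's exact code: the review texts, labels, and `toy_tokenize` helper are hypothetical stand-ins so the snippet runs without downloading a real tokenizer.

```python
import torch
from torch.utils.data import Dataset, DataLoader

# Hypothetical toy data standing in for the restaurant review dataset.
REVIEWS = ["The pasta was excellent!", "Service was slow and rude.",
           "Great ambiance and friendly staff.", "Food arrived cold."]
LABELS = [1, 0, 1, 0]  # 1 = positive, 0 = negative

def toy_tokenize(text, max_len=8):
    """Stand-in for a real tokenizer: maps words to dummy integer ids,
    padding/truncating to a fixed length."""
    ids = [hash(w) % 1000 + 1 for w in text.lower().split()][:max_len]
    ids += [0] * (max_len - len(ids))          # pad with id 0
    mask = [1 if i != 0 else 0 for i in ids]   # attention mask
    return ids, mask

class ReviewDataset(Dataset):
    """Custom PyTorch Dataset wrapping review texts and labels."""
    def __init__(self, texts, labels):
        self.texts, self.labels = texts, labels

    def __len__(self):
        return len(self.texts)

    def __getitem__(self, idx):
        ids, mask = toy_tokenize(self.texts[idx])
        return {"input_ids": torch.tensor(ids),
                "attention_mask": torch.tensor(mask),
                "label": torch.tensor(self.labels[idx])}

# DataLoader handles batching and shuffling over the Dataset.
loader = DataLoader(ReviewDataset(REVIEWS, LABELS), batch_size=2, shuffle=True)
batch = next(iter(loader))
print(batch["input_ids"].shape)  # torch.Size([2, 8])
```

In the lab, the stand-in tokenizer would be replaced by the DistilBERT tokenizer, but the Dataset/DataLoader structure stays the same.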
Syllabus
- Knowledge Distillation
- Restaurant Review Dataset
- PyTorch Dataset and DataLoader
- Fine-tuning DistilBERT for Classification Tasks
- Evaluation
- Freezing Weights of the Base Model
- Using the Hugging Face Transformers Trainer Class for Fine-Tuning
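The weight-freezing strategy from the syllabus can be sketched as below. To keep the example self-contained, a randomly initialised DistilBERT is built from a small hypothetical config rather than downloading pretrained weights; in the actual lab you would instead load them with `DistilBertForSequenceClassification.from_pretrained("distilbert-base-uncased", num_labels=2)`.

```python
from transformers import DistilBertConfig, DistilBertForSequenceClassification

# Tiny config (illustrative sizes only) so no model download is needed.
config = DistilBertConfig(vocab_size=1000, dim=64, n_layers=2, n_heads=2,
                          hidden_dim=128, num_labels=2)
model = DistilBertForSequenceClassification(config)

# Freeze the DistilBERT base so only the classification head trains.
for param in model.distilbert.parameters():
    param.requires_grad = False

# Only the pre_classifier/classifier head parameters remain trainable.
trainable = [name for name, p in model.named_parameters() if p.requires_grad]
print(trainable)
```

Freezing the base drastically reduces the number of trainable parameters, which speeds up fine-tuning at some cost in accuracy compared to updating all weights.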
Taught by
Donato Capitella