Overview
Learn to optimize hyperparameters for SetFit/SBERT text classification through a detailed 22-minute tutorial that demonstrates how to autotune the SetFit system for the best performance on your dataset. Explore the two-phase process in which a Sentence Transformer is first fine-tuned with contrastive learning on positive and negative sentence pairs, and a classification head is then trained on the resulting embeddings. Master few-shot learning techniques as the system automatically searches for the best-performing hyperparameters. Follow along with the provided Jupyter notebooks to understand how unseen examples are passed through the fine-tuned Sentence Transformer to generate embeddings for classification predictions. Dive into practical applications of natural language processing with the SetFit framework, which leverages Sentence Transformers for efficient text classification with limited labeled data.
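The autotuning workflow described above can be sketched with SetFit's built-in, Optuna-backed hyperparameter search. The snippet below is a minimal sketch, not the tutorial's exact notebook: the dataset (a few-shot sample of SST-2), the base checkpoint, and the search ranges are illustrative assumptions, and the API shown follows older SetFit releases (SetFitTrainer); newer versions expose an equivalent Trainer/TrainingArguments interface.

```python
from datasets import load_dataset
from setfit import SetFitModel, SetFitTrainer, sample_dataset

# Illustrative few-shot setup: 8 labeled examples per class sampled from SST-2.
dataset = load_dataset("sst2")
train_dataset = sample_dataset(dataset["train"], label_column="label", num_samples=8)
eval_dataset = dataset["validation"]

def model_init(params):
    # Re-create the model for every trial; head_params configure the logistic-regression head.
    params = params or {}
    head_params = {
        "max_iter": params.get("max_iter", 100),
        "solver": params.get("solver", "liblinear"),
    }
    return SetFitModel.from_pretrained(
        "sentence-transformers/paraphrase-mpnet-base-v2", head_params=head_params
    )

def hp_space(trial):
    # Search space covering both phases: contrastive fine-tuning and the classification head.
    return {
        "learning_rate": trial.suggest_float("learning_rate", 1e-6, 1e-4, log=True),
        "num_epochs": trial.suggest_int("num_epochs", 1, 5),
        "batch_size": trial.suggest_categorical("batch_size", [4, 8, 16, 32]),
        "max_iter": trial.suggest_int("max_iter", 50, 300),
        "solver": trial.suggest_categorical("solver", ["newton-cg", "lbfgs", "liblinear"]),
    }

trainer = SetFitTrainer(
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    model_init=model_init,
    column_mapping={"sentence": "text", "label": "label"},
)

# Run the search, apply the best trial's hyperparameters, and retrain a final model with them.
best_run = trainer.hyperparameter_search(direction="maximize", hp_space=hp_space, n_trials=20)
trainer.apply_hyperparameters(best_run.hyperparameters, final_model=True)
trainer.train()
print(trainer.evaluate())

# Inference: unseen texts are embedded by the fine-tuned Sentence Transformer,
# then classified by the trained head.
preds = trainer.model.predict(["This movie was surprisingly good.", "A complete waste of time."])
print(preds)
```

The search space spans both training phases, so a single study tunes the contrastive fine-tuning of the encoder (learning rate, epochs, batch size) together with the scikit-learn classification head (solver, iterations); the number of trials and the ranges shown are only starting points.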
Syllabus
SETFIT - HYPER Parameter Optimization for SBERT Text Classification (SBERT 45)
Taught by
Discover AI