SetFit with SBERT for Few-Shot Text Classification - Multi-Class and Multi-Label
Discover AI via YouTube
Overview
Learn to implement SetFit with SBERT for text classification in this 28-minute coding tutorial focused on few-shot learning applications. Explore how SetFit leverages Sentence Transformers to generate dense embeddings from paired sentences through a two-phase approach. Master the initial fine-tuning phase, which applies contrastive training to positive and negative pairs created through in-class and out-class selection, followed by training a classification head on the encoded embeddings. Discover how this methodology excels at both multi-class and multi-label text classification with limited training data. Gain practical insights into the theoretical foundations of SetFit and understand how pre-trained SBERT sentence transformers enhance similarity-based classification tasks in Natural Language Processing. Based on the research paper "Efficient Few-Shot Learning Without Prompts," this tutorial lays the foundation for a subsequent video covering detailed SetFit hyperparameter fine-tuning.
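The two-phase recipe described above (contrastive fine-tuning of the sentence transformer on generated positive/negative pairs, then fitting a classification head on the resulting embeddings) is what the Hugging Face setfit library implements. The sketch below is a minimal illustration, assuming the classic SetFitModel/SetFitTrainer API (newer setfit releases favor Trainer and TrainingArguments); the checkpoint name and the tiny inline dataset are placeholders, not taken from the video.

```python
from datasets import Dataset
from sentence_transformers.losses import CosineSimilarityLoss
from setfit import SetFitModel, SetFitTrainer

# Tiny few-shot training set (a handful of labeled examples per class).
train_ds = Dataset.from_dict({
    "text": [
        "great movie, loved every minute",
        "terrible plot, a waste of time",
        "an instant classic",
        "boring and predictable",
    ],
    "label": [1, 0, 1, 0],
})

# Load a pre-trained SBERT checkpoint as the embedding backbone.
# For multi-label tasks, setfit also accepts multi_target_strategy="one-vs-rest".
model = SetFitModel.from_pretrained("sentence-transformers/paraphrase-mpnet-base-v2")

trainer = SetFitTrainer(
    model=model,
    train_dataset=train_ds,
    loss_class=CosineSimilarityLoss,  # contrastive loss over sentence pairs
    num_iterations=20,                # pairs generated per example (in-class / out-class)
    num_epochs=1,                     # epochs for the embedding fine-tuning phase
    batch_size=16,
)

# Phase 1: contrastive fine-tuning of the sentence transformer.
# Phase 2: training the classification head on the encoded embeddings.
trainer.train()

# Classify unseen texts with the trained model.
preds = model.predict(["a masterpiece", "do not bother watching this"])
print(preds)
```

The key few-shot lever here is num_iterations: it controls how many positive/negative pairs are sampled per training example, so even a few labeled sentences yield many contrastive training pairs.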
Syllabus
CODE SetFit w/ SBERT for Text Classification (Few-Shot Learning) multi-class multi-label (SBERT 44)
Taught by
Discover AI