SETFIT Few-Shot Learning for SBERT Text Classification - Part 43
Discover AI via YouTube
Overview
Learn about SETFIT's few-shot learning methodology for text classification in this 19-minute technical video, which demonstrates how it outperforms GPT-3 without using prompts. Explore the theoretical foundations of SETFIT and discover how it leverages pre-trained SBERT Sentence Transformers to achieve strong performance in both multi-class and multi-label classification, even with only a handful of training samples per class. Dive into the efficient few-shot learning approach based on the research paper "Efficient Few-Shot Learning Without Prompts," examining how SBERT Sentence Transformers can be applied to similarity-based classification tasks in Natural Language Processing. Follow along through key concepts including language models, context learning, the problem statement, training data set construction, and expert model fine-tuning, setting the stage for a subsequent hands-on coding implementation.
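For orientation before the hands-on part, here is a minimal sketch of the kind of workflow the video builds toward, using the Hugging Face setfit library with an SBERT backbone. The dataset ("sst2"), the checkpoint name, and the 8-samples-per-class setting are illustrative assumptions rather than details taken from the video, and the SetFitTrainer API shown reflects older setfit releases (newer versions expose Trainer and TrainingArguments instead).

```python
from datasets import load_dataset
from sentence_transformers.losses import CosineSimilarityLoss
from setfit import SetFitModel, SetFitTrainer, sample_dataset

# Load a sentiment dataset from the Hugging Face Hub (illustrative choice)
dataset = load_dataset("sst2")

# Simulate the few-shot regime: keep only 8 labeled examples per class
train_dataset = sample_dataset(dataset["train"], label_column="label", num_samples=8)
eval_dataset = dataset["validation"]

# Start from a pre-trained SBERT Sentence Transformer checkpoint
model = SetFitModel.from_pretrained("sentence-transformers/paraphrase-mpnet-base-v2")

trainer = SetFitTrainer(
    model=model,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    loss_class=CosineSimilarityLoss,   # contrastive objective on generated sentence pairs
    metric="accuracy",
    batch_size=16,
    num_iterations=20,                 # number of text pairs to generate per example
    num_epochs=1,
    column_mapping={"sentence": "text", "label": "label"},
)

# Contrastively fine-tune the sentence embeddings, then fit the classification head
trainer.train()
metrics = trainer.evaluate()
print(metrics)

# Run inference on new sentences
preds = model(["i loved this movie!", "the plot made no sense at all"])
print(preds)
```

The two-stage design sketched here, contrastive fine-tuning of the sentence embeddings followed by training a lightweight classification head, is what lets the approach work without prompts or large generative models.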
Syllabus
Introduction
Language Model
Context Learning
Problem Statement
Training Data Set
Build Training Data Set
FineTune Expert Model
Abstract
Classification
Summary
Taught by
Discover AI