Fine-Tune High Performance Sentence Transformers With Multiple Negatives Ranking
James Briggs via YouTube
Overview
Explore the process of fine-tuning high-performance sentence transformers using Multiple Negatives Ranking (MNR) loss in this 37-minute video tutorial. Learn about the evolution of transformer-produced sentence embeddings, from BERT cross-encoders to SBERT and beyond. Discover how MNR loss has reshaped the field, enabling newer models to outperform their predecessors with far less training. Dive into the implementation of MNR loss for fine-tuning sentence transformers, covering both a detailed PyTorch approach and a simplified method using the sentence-transformers library. Gain insights into NLI training data, preprocessing techniques, and visual representations of SBERT fine-tuning and MNR loss. Compare results and understand the impact of this technique on sentence embedding quality.
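The detailed approach builds MNR loss by hand. A minimal PyTorch sketch of the idea, assuming a batch of (anchor, positive) pairs already encoded into embeddings: each anchor's true positive sits on the diagonal of an anchor-versus-positive similarity matrix, and every other positive in the batch serves as an in-batch negative, so the loss reduces to cross-entropy with diagonal labels. The similarity scale of 20 is a common choice assumed here, not a value taken from the video.

```python
import torch
import torch.nn.functional as F

def mnr_loss(anchor_emb: torch.Tensor, positive_emb: torch.Tensor,
             scale: float = 20.0) -> torch.Tensor:
    """Multiple Negatives Ranking loss over a batch of (anchor, positive) pairs."""
    # Cosine similarity between every anchor and every positive in the batch
    a = F.normalize(anchor_emb, dim=-1)
    p = F.normalize(positive_emb, dim=-1)
    scores = a @ p.T * scale  # (batch_size, batch_size) similarity matrix
    # Anchor i's matching positive is at column i; all other columns act as
    # in-batch negatives, so this is cross-entropy with diagonal labels
    labels = torch.arange(scores.size(0), device=scores.device)
    return F.cross_entropy(scores, labels)
```

The simplified method wraps the same idea in the sentence-transformers library, which ships MNR as losses.MultipleNegativesRankingLoss. A sketch under assumptions: the bert-base-uncased checkpoint, the toy NLI-style premise/entailment pairs, and the hyperparameters are illustrative, not taken from the tutorial.

```python
from torch.utils.data import DataLoader
from sentence_transformers import SentenceTransformer, models, InputExample, losses

# Build an SBERT-style model: transformer encoder + mean pooling
word_emb = models.Transformer('bert-base-uncased')
pooling = models.Pooling(word_emb.get_word_embedding_dimension())
model = SentenceTransformer(modules=[word_emb, pooling])

# MNR expects (anchor, positive) pairs; with NLI data this is typically
# a premise paired with a hypothesis it entails
train_examples = [
    InputExample(texts=['A man is eating food.', 'A man is eating something.']),
    InputExample(texts=['Kids are playing outside.', 'Children play outdoors.']),
]
loader = DataLoader(train_examples, shuffle=True, batch_size=2)

loss = losses.MultipleNegativesRankingLoss(model)
model.fit(train_objectives=[(loader, loss)], epochs=1, warmup_steps=10)
```

Because every other item in a batch acts as a negative, MNR generally benefits from larger batch sizes than the toy values shown here.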
Syllabus
Intro
NLI Training Data
Preprocessing
SBERT Fine-Tuning Visuals
MNR Loss Visual
MNR in PyTorch
MNR in Sentence Transformers
Results
Outro
Taught by
James Briggs