YouTube

Fine-Tune High Performance Sentence Transformers With Multiple Negatives Ranking

James Briggs via YouTube

Overview

Explore the process of fine-tuning high-performance sentence transformers using Multiple Negatives Ranking (MNR) loss in this 37-minute video tutorial. Learn about the evolution of transformer-produced sentence embeddings, from BERT cross-encoders to SBERT and beyond. Discover how MNR loss has revolutionized the field, enabling newer models to quickly outperform their predecessors. Dive into the implementation of MNR loss for fine-tuning sentence transformers, covering both a detailed approach and a simplified method using the sentence-transformers library. Gain insights into NLI training data, preprocessing techniques, and visual representations of SBERT fine-tuning and MNR loss. Compare results and understand the impact of this advanced technique on sentence embedding quality.
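To make the core idea concrete: MNR loss takes a batch of (anchor, positive) sentence-embedding pairs, scores every anchor against every positive, and applies cross-entropy so that each anchor ranks its own positive above the other positives in the batch, which serve as in-batch negatives. The sketch below is illustrative only and not code from the video (the tutorial uses PyTorch and the sentence-transformers library); it implements the loss in plain Python over pre-computed embedding vectors, with an assumed similarity scale of 20 as commonly used with cosine similarity.

```python
import math

def mnr_loss(anchors, positives, scale=20.0):
    """Multiple Negatives Ranking loss over a batch of (anchor, positive)
    embedding pairs. For anchor i, positives[i] is the true match; every
    other positive in the batch acts as an in-batch negative."""
    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        norm_a = math.sqrt(sum(x * x for x in a))
        norm_b = math.sqrt(sum(x * x for x in b))
        return dot / (norm_a * norm_b)

    total = 0.0
    for i, anchor in enumerate(anchors):
        # Scaled cosine similarity of anchor i against every positive.
        scores = [scale * cosine(anchor, p) for p in positives]
        # Cross-entropy with label i: the true pair sits on the diagonal
        # of the batch similarity matrix.
        log_sum_exp = math.log(sum(math.exp(s) for s in scores))
        total += log_sum_exp - scores[i]
    return total / len(anchors)
```

With well-separated pairs the loss approaches zero, since each anchor's own positive dominates the softmax; identical positives push it toward log(batch size), reflecting maximal confusion among the in-batch negatives.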

Syllabus

Intro
NLI Training Data
Preprocessing
SBERT Finetuning Visuals
MNR Loss Visual
MNR in PyTorch
MNR in Sentence Transformers
Results
Outro

Taught by

James Briggs
