Fine-Tune Sentence Transformers the OG Way - With NLI Softmax Loss

James Briggs via YouTube

Overview

Explore the fine-tuning of sentence transformers using Natural Language Inference (NLI) softmax loss in this video tutorial. Learn about the training approach used in the first sentence-BERT (SBERT) model for producing sentence embeddings. Dive into the preprocessing of NLI data, implement the training loop in PyTorch, and then reproduce it with the Sentence-Transformers library. Examine the results and understand why this method, while historically significant, has been superseded by more advanced techniques such as multiple-negatives ranking loss. Gain insights into applications of dense sentence embeddings such as semantic textual similarity, clustering, and information retrieval.
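The softmax-loss setup described above can be sketched in plain PyTorch. In the original SBERT recipe, a premise embedding u and a hypothesis embedding v are concatenated with their element-wise difference |u - v| and passed through a linear layer to predict the three NLI labels; cross-entropy on those logits is the "softmax loss". The class and variable names below are illustrative, not from the video:

```python
import torch
import torch.nn as nn

class NLISoftmaxHead(nn.Module):
    """SBERT-style classification head: concat(u, v, |u - v|) -> NLI logits."""
    def __init__(self, embed_dim: int, num_labels: int = 3):
        super().__init__()
        self.classifier = nn.Linear(3 * embed_dim, num_labels)

    def forward(self, u: torch.Tensor, v: torch.Tensor) -> torch.Tensor:
        # Build the (u, v, |u - v|) feature vector and project to label logits
        features = torch.cat([u, v, torch.abs(u - v)], dim=-1)
        return self.classifier(features)

# Toy usage with random tensors standing in for pooled BERT embeddings
torch.manual_seed(0)
head = NLISoftmaxHead(embed_dim=768)
u = torch.randn(16, 768)   # premise sentence embeddings
v = torch.randn(16, 768)   # hypothesis sentence embeddings
logits = head(u, v)
labels = torch.randint(0, 3, (16,))  # 0=entailment, 1=neutral, 2=contradiction
loss = nn.CrossEntropyLoss()(logits, labels)
```

In real fine-tuning, gradients from this loss flow back through the pooling layer into the transformer itself, which is what shapes the sentence embeddings; the classification head is discarded after training.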

Syllabus

Intro
NLI Fine-tuning
Softmax Loss Training Overview
Preprocessing NLI Data
PyTorch Process
Using Sentence-Transformers
Results
Outro
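The "Preprocessing NLI Data" step in the syllabus amounts to turning raw SNLI/MultiNLI-style records into (premise, hypothesis, integer-label) pairs and dropping unlabeled rows. A minimal sketch, with made-up example records:

```python
# Map NLI string labels to the integer ids used by the softmax head
label_map = {"entailment": 0, "neutral": 1, "contradiction": 2}

# Hypothetical SNLI-style records; "-" marks an example with no gold label
raw = [
    {"premise": "A man is eating food.", "hypothesis": "A man eats something.",
     "label": "entailment"},
    {"premise": "A man is eating food.", "hypothesis": "The man is sleeping.",
     "label": "contradiction"},
    {"premise": "A soccer game.", "hypothesis": "Some men play a sport.",
     "label": "-"},
]

# Keep only labeled rows and convert to (premise, hypothesis, label_id) tuples
pairs = [
    (r["premise"], r["hypothesis"], label_map[r["label"]])
    for r in raw
    if r["label"] in label_map
]
```

With the Sentence-Transformers library, each tuple would then be wrapped in an `InputExample` and fed through a `DataLoader` to `losses.SoftmaxLoss`.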

Taught by

James Briggs

