
Is GPL the Future of Sentence Transformers - Generative Pseudo-Labeling Deep Dive

James Briggs via YouTube

Overview

Dive deep into Generative Pseudo-Labeling (GPL) and its potential impact on sentence transformers in this comprehensive video tutorial. Explore the challenges of training sentence transformers and how GPL offers a promising solution for fine-tuning high-performance bi-encoder models using unlabeled text data. Learn about the core concepts of GPL, including query generation, negative mining, and pseudo-labeling, with practical code examples using the CORD-19 dataset. Discover the importance of these techniques in building intelligent language models capable of understanding and responding to natural language queries. Gain insights into the implementation of GPL, including the use of Margin MSE Loss and fine-tuning strategies. Conclude with a discussion on the future of sentence transformers and the potential applications of GPL across various industries.
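The three GPL stages described above can be sketched in plain Python. This is an illustrative toy, not the real GPL implementation: actual GPL uses a T5 doc2query model for query generation, a bi-encoder for negative mining, and a cross-encoder for pseudo-labeling, while the stand-in functions below (`generate_query`, `score`, `mine_negative`) are hypothetical simplifications so the data flow is runnable without model downloads.

```python
# Toy sketch of the GPL data-preparation pipeline.
# Real GPL swaps in a T5 query generator, a bi-encoder for negative
# mining, and a cross-encoder for margin pseudo-labeling.

passages = [
    "COVID-19 vaccines elicit an antibody response.",
    "The CORD-19 dataset collects coronavirus research papers.",
    "Transformers use self-attention over token embeddings.",
]

def generate_query(passage):
    """Stand-in for T5 query generation: reuse the first few words."""
    return " ".join(passage.lower().split()[:4]) + "?"

def score(query, passage):
    """Stand-in relevance score: word overlap (GPL uses a cross-encoder)."""
    q = set(query.lower().strip("?").split())
    p = set(passage.lower().split())
    return len(q & p) / len(q)

def mine_negative(query, positive):
    """Pick the highest-scoring passage that is not the positive (a hard negative)."""
    candidates = [p for p in passages if p != positive]
    return max(candidates, key=lambda p: score(query, p))

training_data = []
for pos in passages:
    query = generate_query(pos)                      # step 1: query generation
    neg = mine_negative(query, pos)                  # step 2: negative mining
    margin = score(query, pos) - score(query, neg)   # step 3: pseudo-label (score margin)
    training_data.append((query, pos, neg, margin))

for row in training_data:
    print(row)
```

The key output is the (query, positive, negative, margin) tuple per passage: that margin, produced by a teacher model, is exactly what the bi-encoder is later trained to reproduce via Margin MSE loss.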

Syllabus

Intro
Semantic Web and Other Uses
Why GPL?
How GPL Works
Query Generation
CORD-19 Dataset and Download
Query Generation Code
Query Generation is Not Perfect
Negative Mining
Negative Mining Implementation
Negative Mining Code
Pseudo-Labeling
Pseudo-Labeling Code
Importance of Pseudo-Labeling
Margin MSE Loss
MarginMSE Fine-tune Code
Choosing Number of Steps
Fast Evaluation
What's Next for Sentence Transformers?
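The Margin MSE loss covered in the syllabus distills the teacher cross-encoder's score margins into the student bi-encoder. A minimal numeric sketch (plain Python; the score values are made up purely for illustration):

```python
# Margin MSE loss: train the student (bi-encoder) so that its score
# margin between positive and negative passages matches the teacher's
# (cross-encoder's) margin. Numbers below are illustrative only.

def margin_mse(student_pos, student_neg, teacher_pos, teacher_neg):
    """Squared error between student and teacher (pos - neg) margins."""
    student_margin = student_pos - student_neg
    teacher_margin = teacher_pos - teacher_neg
    return (student_margin - teacher_margin) ** 2

# One training example: the teacher sees a margin of 8.0 - 2.0 = 6.0,
# the student currently produces a margin of 4.0 - 1.5 = 2.5.
loss = margin_mse(student_pos=4.0, student_neg=1.5,
                  teacher_pos=8.0, teacher_neg=2.0)
print(loss)  # (2.5 - 6.0) ** 2 = 12.25
```

Because only the margin (not the absolute scores) is matched, the student is free to use a different score scale than the teacher, which is part of why pseudo-labels from an imperfect teacher still train a strong bi-encoder.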

Taught by

James Briggs

