Today Unsupervised Sentence Transformers, Tomorrow Skynet - How TSDAE Works
James Briggs via YouTube
Overview
Explore the world of unsupervised sentence transformers in this comprehensive 44-minute video tutorial. Dive into the challenges of adapting pretrained transformers to produce meaningful sentence vectors, especially in domains and languages with limited labeled data. Learn about the Transformer-based Sequential Denoising Auto-Encoder (TSDAE) approach as an alternative to supervised fine-tuning methods. Discover the process of data preparation, model initialization, training, and evaluation for TSDAE. Compare the effectiveness of TSDAE with supervised methods and understand its advantages when labeled data is scarce. Gain insights into why language embeddings matter, supervised techniques such as Natural Language Inference and Semantic Textual Similarity, and the potential of multilingual training.
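For readers who want a concrete picture of the data preparation, model initialization, and training steps the tutorial walks through, here is a minimal sketch using the sentence-transformers library's TSDAE components. The model name, sentences, batch size, hyperparameters, and output path are illustrative assumptions, not values taken from the video.

```python
import nltk
from torch.utils.data import DataLoader
from sentence_transformers import SentenceTransformer, models, datasets, losses

# The noising step in DenoisingAutoEncoderDataset tokenizes with NLTK,
# so the punkt tokenizer must be available (see the "NLTK Error" chapter).
nltk.download("punkt")

# Data preparation: plain, unlabeled sentences; the dataset pairs each
# sentence with a noised (token-deleted) copy for the denoising objective.
train_sentences = [
    "TSDAE trains sentence embeddings without labels.",
    "The encoder compresses a corrupted sentence into a single vector.",
    "A decoder then tries to reconstruct the original sentence.",
]
train_dataset = datasets.DenoisingAutoEncoderDataset(train_sentences)
train_dataloader = DataLoader(train_dataset, batch_size=8, shuffle=True)

# Initialize model: a pretrained transformer with CLS pooling
# ("bert-base-uncased" is an illustrative choice).
word_embedding_model = models.Transformer("bert-base-uncased")
pooling = models.Pooling(word_embedding_model.get_word_embedding_dimension(), "cls")
model = SentenceTransformer(modules=[word_embedding_model, pooling])

# Model training: the loss attaches a decoder (optionally weight-tied to the
# encoder) that reconstructs the original sentence from the pooled embedding.
train_loss = losses.DenoisingAutoEncoderLoss(
    model, decoder_name_or_path="bert-base-uncased", tie_encoder_decoder=True
)
model.fit(
    train_objectives=[(train_dataloader, train_loss)],
    epochs=1,
    weight_decay=0,
    scheduler="constantlr",
    optimizer_params={"lr": 3e-5},
    show_progress_bar=True,
)

model.save("output/tsdae-bert")  # output path is an assumption
```

In practice training runs over tens of thousands of in-domain sentences; the toy list above only keeps the sketch self-contained and runnable.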
Syllabus
Why Language Embedding Matters
Supervised Methods
Natural Language Inference
Semantic Textual Similarity
Multilingual Training
TSDAE Unsupervised
Data Preparation
Initialize Model
Model Training
NLTK Error
Evaluation
TSDAE vs Supervised Methods
Why TSDAE is Cool
Taught by
James Briggs