CMU Advanced NLP: Pre-training Methods
Graham Neubig via YouTube
Overview
Syllabus
Introduction
Neural Networks
Goals
Multitask learning
Level of variety
Multitasking
Related Tasks
Multitask Learning
Pretraining
Pretraining Methods
Sentence Representations
Sentence Pair Classification
Sentence Pair Classification Examples
Semantic Similarity/Relatedness
Textual entailment
Methods
Autoencoder
Skip-thought vectors
Paraphrase-based contrastive learning
Large-scale paraphrasing
Multitask entailment
Supervised training
Sentence transformers
Context effect
Masking
Taught by
Graham Neubig