CMU Advanced NLP: Pre-training Methods

Graham Neubig via YouTube

Overview

Explore advanced natural language processing techniques in this comprehensive lecture on pre-training methods. Delve into multi-task learning concepts, sentence embeddings, BERT and its variants, and alternative language modeling objectives. Gain insights into sentence representations, semantic similarity, textual entailment, and various pre-training approaches including autoencoders, skip-thought vectors, and paraphrase-based contrastive learning. Examine the impact of context and masking in language models, and understand the applications of these techniques in real-world NLP tasks.
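
As a taste of the masked language modeling objective behind BERT, here is a minimal sketch using the Hugging Face transformers library (an assumption on our part; the lecture itself is not tied to any particular library) to fill in a masked token with a pre-trained model:

```python
# Minimal sketch of BERT's masked language modeling objective.
# Assumes the Hugging Face `transformers` library and PyTorch are
# installed (pip install transformers torch); the model choice is
# illustrative, not prescribed by the lecture.
from transformers import pipeline

# The fill-mask pipeline loads a pre-trained masked language model.
unmasker = pipeline("fill-mask", model="bert-base-uncased")

# BERT scores candidate tokens for the [MASK] position.
for pred in unmasker("Pre-training learns general [MASK] representations."):
    print(f"{pred['token_str']}\t{pred['score']:.3f}")
```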

Syllabus

Introduction
Neural Networks
Goals
Multi-task learning
Level of variety
Multi-tasking
Related Tasks
Multi-task Learning
Pre-training
Pre-training Methods
Sentence Representations
Sentence Pair Classification
Sentence Pair Classification Examples
Semantic Similarity / Relatedness
Textual entailment
Methods
Autoencoder
Skip-thought vectors
Paraphrase-based contrastive learning
Large-scale paraphrasing
Multi-task entailment
Supervised training
Sentence transformers
Context effect
Masking
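
Several of the syllabus topics above (sentence representations, semantic similarity, sentence transformers) lend themselves to a quick hands-on sketch. The example below assumes the sentence-transformers library and the all-MiniLM-L6-v2 checkpoint, neither of which is mandated by the lecture:

```python
# Minimal sketch of sentence embeddings for semantic similarity.
# Assumes the sentence-transformers library is installed
# (pip install sentence-transformers); the model is illustrative.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

sentences = [
    "A man is playing a guitar.",
    "Someone is performing music on a stringed instrument.",
    "The stock market fell sharply today.",
]

# Encode each sentence into a fixed-size dense vector.
embeddings = model.encode(sentences, convert_to_tensor=True)

# Pairwise cosine similarity: related sentences score higher.
print(util.cos_sim(embeddings, embeddings))
```

Semantically related pairs (the first two sentences here) should receive a noticeably higher cosine score than unrelated ones, which is the property that sentence-pair tasks like semantic similarity and textual entailment build on.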

Taught by

Graham Neubig
