Neural Nets for NLP 2019: Sentence and Contextualized Word Representations
- 1 Intro
- 2 Goal for Today
- 3 Where would we need/use Sentence Representations?
- 4 Sentence Classification
- 5 Paraphrase Identification (Dolan and Brockett 2005): identify whether A and B mean the same thing
- 6 Textual Entailment (Dagan et al. 2006, Marelli et al. 2014)
- 7 Model for Sentence Pair Processing
- 8 Types of Learning
- 9 Plethora of Tasks in NLP
- 10 Rule of Thumb 2
- 11 Standard Multi-task Learning
- 12 Thinking about Multi-tasking, and Pre-trained Representations
- 13 General Model Overview
- 14 Language Model Transfer
- 15 End-to-end vs. Pre-training
- 16 Context Prediction Transfer (Skip-thought Vectors) (Kiros et al. 2015)
- 17 Paraphrase ID Transfer (Wieting et al. 2015)
- 18 Large Scale Paraphrase Data (ParaNMT-50M) (Wieting and Gimpel 2018)
- 19 Entailment Transfer (InferSent) (Conneau et al. 2017) (see the sentence-pair sketch after this list)
- 20 Bi-directional Language Modeling Objective (ELMo)
- 21 Masked Word Prediction (BERT) (see the fill-mask sketch after this list)
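
The sentence-pair models covered in items 7 and 19 share one recipe: encode each sentence independently, then classify a combination of the two vectors. Below is a minimal PyTorch sketch of the InferSent-style combination [u; v; |u-v|; u*v] from Conneau et al. 2017. The BiLSTM-with-max-pooling encoder follows the paper, but the hyperparameters and the three-way class count are illustrative assumptions, not the lecture's exact configuration.

```python
# Sketch of an InferSent-style sentence-pair classifier (Conneau et al. 2017).
# Dimensions and layer sizes here are illustrative, not the paper's exact setup.
import torch
import torch.nn as nn

class SentenceEncoder(nn.Module):
    """BiLSTM encoder with max pooling over time, as in InferSent."""
    def __init__(self, vocab_size, emb_dim=300, hid_dim=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.lstm = nn.LSTM(emb_dim, hid_dim, batch_first=True, bidirectional=True)

    def forward(self, token_ids):                      # (batch, seq_len)
        states, _ = self.lstm(self.embed(token_ids))   # (batch, seq_len, 2*hid)
        return states.max(dim=1).values                # max pool -> (batch, 2*hid)

class PairClassifier(nn.Module):
    """Classify a sentence pair, e.g. entailment / neutral / contradiction."""
    def __init__(self, encoder, enc_dim=1024, n_classes=3):
        super().__init__()
        self.encoder = encoder
        self.mlp = nn.Sequential(
            nn.Linear(4 * enc_dim, 512), nn.ReLU(), nn.Linear(512, n_classes))

    def forward(self, sent_a, sent_b):
        u, v = self.encoder(sent_a), self.encoder(sent_b)
        # The InferSent feature combination: [u; v; |u - v|; u * v]
        feats = torch.cat([u, v, (u - v).abs(), u * v], dim=-1)
        return self.mlp(feats)

# Toy usage with random token ids; a real run needs a tokenizer and training.
model = PairClassifier(SentenceEncoder(vocab_size=10000))
a = torch.randint(1, 10000, (2, 7))
b = torch.randint(1, 10000, (2, 9))
print(model(a, b).shape)  # torch.Size([2, 3])
```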
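
Item 21's masked word prediction objective trains BERT to fill in blanked-out tokens from bidirectional context. A quick way to see the trained objective in action is the Hugging Face transformers fill-mask pipeline; note this API post-dates the 2019 lecture, and the checkpoint name below is just one common choice.

```python
# Query a pretrained BERT with its masked word prediction objective.
from transformers import pipeline

unmasker = pipeline("fill-mask", model="bert-base-uncased")
# Print the top predictions for the [MASK] slot with their probabilities.
for pred in unmasker("The movie was [MASK] and I would watch it again."):
    print(f"{pred['token_str']:>12}  {pred['score']:.3f}")
```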