Cross-lingual transfer
YouTube videos curated by Class Central.
Classroom Contents
CMU Multilingual NLP 2020 - Multilingual Training and Cross-Lingual Transfer
- 1 Many languages are left behind
- 2 Roadmap
- 3 Cross-lingual transfer
- 4 Supporting multiple languages could be tedious
- 5 Combining the two methods
- 6 Use case: COVID-19 response
- 7 Rapid adaptation of massive multilingual models
- 8 Meta-learning for multilingual training
- 9 Multilingual NMT
- 10 Improve zero-shot NMT
- 11 Align multilingual representation
- 12 Zero-shot transfer for pretrained representations
- 13 Massively multilingual training
- 14 Training data highly imbalanced
- 15 Heuristic Sampling of Data
- 16 Learning to balance data
- 17 Problem: sometimes underperforms bilingual model
- 18 Multilingual Knowledge Distillation
- 19 Adding Language-specific layers
- 20 Problem: one-to-many transfer
- 21 Problem: multilingual evaluation
- 22 Discussion question
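The chapters "Training data highly imbalanced" and "Heuristic Sampling of Data" above refer to a standard trick in massively multilingual training: sample languages in proportion to their corpus size raised to a power 1/T, so low-resource languages are upsampled toward uniform as the temperature T grows. A minimal sketch of that heuristic follows; the function name and the corpus sizes are hypothetical, chosen only to illustrate the idea.

```python
def temperature_sampling_probs(sizes, T=5.0):
    """Temperature-based sampling over per-language corpus sizes.

    With T=1 languages are sampled proportionally to their data;
    as T grows, the distribution flattens toward uniform, which
    upsamples low-resource languages.
    """
    total = sum(sizes.values())
    # Exponentiate each language's data proportion by 1/T.
    weights = {lang: (n / total) ** (1.0 / T) for lang, n in sizes.items()}
    z = sum(weights.values())
    return {lang: w / z for lang, w in weights.items()}

# Hypothetical corpus sizes (sentences per language).
sizes = {"en": 1_000_000, "fr": 100_000, "sw": 1_000}
probs = temperature_sampling_probs(sizes, T=5.0)
```

With these toy numbers, Swahili's sampling probability rises far above its raw data share, while English is still sampled most often, which is the balance the "Learning to balance data" chapter then tries to tune automatically.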