CMU Multilingual NLP - Unsupervised Translation
Classroom Contents
- 1 Intro
- 2 Conditional Text Generation
- 3 Modeling: Conditional Language Models
- 4 What if we don't have parallel data?
- 5 Can't we just collect/generate the data?
- 6 Outline
- 7 Initialization: Unsupervised Word Translation
- 8 Unsupervised Word Translation: Adversarial Training (sketched in code after this list)
- 9 Back-translation (sketched in code after this list)
- 10 One-slide primer on phrase-based statistical MT
- 11 Unsupervised Statistical MT
- 12 Bidirectional Modeling: the same encoder-decoder is used for both languages, initialized with cross-lingual word embeddings
- 13 Unsupervised MT: Training Objective 1 (sketched in code after this list)
- 14 How does it work?
- 15 Unsupervised NMT: Training Objective 3
- 16 In summary
- 17 When Does Unsupervised Machine Translation Work?
- 18 Reasons for this poor performance
- 19 Open Problems
- 20 Better Initialization: Cross-Lingual Language Models
- 21 Better Initialization: Multilingual BART
- 22 Better Initialization: Masked Sequence-to-Sequence Model (MASS), an encoder-decoder formulation of masked language modelling
- 23 Multilingual Unsupervised MT
- 24 Multilingual UNMT
- 25 How practical is the strict unsupervised scenario?
- 26 Related Area: Style Transfer
- 27 Discussion Question
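
The adversarial training of item 8 is usually done MUSE-style (Conneau et al., 2018): a linear map W is trained so that mapped source embeddings look like target embeddings, while a discriminator is trained to tell them apart. Below is a minimal PyTorch sketch of that two-player loop; the embedding matrices, network sizes, and learning rates are illustrative stand-ins, not the lecture's actual setup.

```python
# Minimal sketch of MUSE-style adversarial embedding alignment
# (Conneau et al., 2018). X and Y stand in for pre-trained source and
# target word embeddings; all sizes and hyperparameters are illustrative.
import torch
import torch.nn as nn

dim, n_words, batch = 300, 5000, 32
X = torch.randn(n_words, dim)        # source embeddings (stand-in)
Y = torch.randn(n_words, dim)        # target embeddings (stand-in)

W = nn.Linear(dim, dim, bias=False)  # linear mapping to be learned
D = nn.Sequential(                   # discriminator: mapped vs. real target?
    nn.Linear(dim, 512), nn.LeakyReLU(0.2), nn.Linear(512, 1))

opt_W = torch.optim.SGD(W.parameters(), lr=0.1)
opt_D = torch.optim.SGD(D.parameters(), lr=0.1)
bce = nn.BCEWithLogitsLoss()

for step in range(1000):
    idx = torch.randint(n_words, (batch,))
    src, tgt = X[idx], Y[idx]

    # 1) Train the discriminator: label mapped source 0, real target 1.
    logits = torch.cat([D(W(src).detach()), D(tgt)])
    labels = torch.cat([torch.zeros(batch, 1), torch.ones(batch, 1)])
    loss_D = bce(logits, labels)
    opt_D.zero_grad(); loss_D.backward(); opt_D.step()

    # 2) Train the mapping: update W so the discriminator is fooled
    #    into calling W(src) "target".
    loss_W = bce(D(W(src)), torch.ones(batch, 1))
    opt_W.zero_grad(); loss_W.backward(); opt_W.step()
```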
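Back-translation (item 9) turns monolingual data into synthetic parallel data: each model translates the other language's monolingual text, and the result is paired with the real text as supervision. A minimal sketch of one round, assuming two hypothetical seq2seq objects `src2tgt` and `tgt2src` with `translate()` and `train_on()` methods (illustrative names, not any real library's API):

```python
# Minimal sketch of one round of iterative back-translation; the model
# interface (.translate, .train_on) is hypothetical.

def back_translation_round(src2tgt, tgt2src, mono_src, mono_tgt):
    # Target monolingual text -> synthetic source; train src->tgt on
    # (synthetic source, real target) pairs.
    synth_src = [tgt2src.translate(t) for t in mono_tgt]
    src2tgt.train_on(list(zip(synth_src, mono_tgt)))

    # Symmetrically: source monolingual text -> synthetic target.
    synth_tgt = [src2tgt.translate(s) for s in mono_src]
    tgt2src.train_on(list(zip(synth_tgt, mono_src)))
```

Repeating the round lets each direction train on the other's progressively better synthetic data.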
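Training Objective 1 of item 13 is, in the standard unsupervised NMT recipe (Lample et al., 2018), denoising auto-encoding: the model reconstructs a sentence from a corrupted copy of itself. A minimal sketch of the corruption step, with illustrative word-drop probability and local-shuffle window:

```python
# Minimal sketch of the noise function for denoising auto-encoding
# (word drop + local shuffle, as in Lample et al., 2018).
# p_drop and k are illustrative defaults.
import random

def add_noise(tokens, p_drop=0.1, k=3):
    # Drop each word with probability p_drop (keep at least one token).
    kept = [t for t in tokens if random.random() > p_drop] or tokens[:1]
    # Local shuffle: a random offset in [0, k] bounds how far any word
    # can move from its original position.
    keys = [i + random.uniform(0, k) for i in range(len(kept))]
    return [tok for _, tok in sorted(zip(keys, kept))]

print(add_noise("the cat sat on the mat".split()))
```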