Class Central Classrooms beta
YouTube videos curated by Class Central.
Classroom Contents
CMU Neural Nets for NLP 2020 - Multitask and Multilingual Learning
- 1 Intro
- 2 Remember, Neural Nets are Feature Extractors!
- 3 Reminder: Types of Learning
- 4 Standard Multi-task Learning
- 5 Selective Parameter Adaptation: sometimes it is better to adapt only some of the parameters
- 6 Different Layers for Different Tasks (Hashimoto et al. 2017)
- 7 Multiple Annotation Standards
- 8 Supervised/Unsupervised Adaptation
- 9 Supervised Domain Adaptation through Feature Augmentation
- 10 Unsupervised Learning through Feature Matching
- 11 Multi-lingual Sequence-to-sequence Models
- 12 Multi-lingual Pre-training
- 13 Difficulties in Fully Multi-lingual Learning
- 14 Data Balancing
- 15 Cross-lingual Transfer Learning
- 16 What if languages don't share the same script?
- 17 Zero-shot Transfer to New Languages
- 18 Data Creation, Active Learning: in order to get in-language training data, Active Learning (AL) can be used
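The "Data Balancing" chapter concerns how to weight languages with very different corpus sizes during multi-lingual training. A common approach (used in multilingual pretraining, and assumed here; the outline does not specify the lecture's exact method) is temperature-based sampling, which upsamples low-resource languages. A minimal sketch, with hypothetical corpus sizes and the conventional temperature τ = 5:

```python
# Temperature-based data balancing: sample language i with probability
# proportional to (n_i / N) ** (1 / tau). tau = 1 recovers proportional
# sampling; larger tau flattens the distribution toward uniform.
# Corpus sizes below are made up for illustration.
corpus_sizes = {"en": 1_000_000, "de": 100_000, "sw": 1_000}

def sampling_probs(sizes, tau=5.0):
    total = sum(sizes.values())
    # Raise each language's share to the power 1/tau, then renormalize.
    scaled = {lang: (n / total) ** (1.0 / tau) for lang, n in sizes.items()}
    z = sum(scaled.values())
    return {lang: s / z for lang, s in scaled.items()}

probs = sampling_probs(corpus_sizes)
```

With these sizes, Swahili's sampling probability rises well above its raw share of the data while the ordering of languages by size is preserved.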
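The final chapter notes that Active Learning can be used to obtain in-language training data. One standard AL strategy (an illustrative choice, not necessarily the one the lecture presents) is pool-based uncertainty sampling: ask annotators to label the unlabeled examples on which the model is least confident. A minimal sketch with hypothetical sentences and made-up model probabilities:

```python
import math

def entropy(probs):
    # Shannon entropy of a predicted distribution; higher = more uncertain.
    return -sum(p * math.log(p) for p in probs if p > 0)

# Hypothetical unlabeled pool: sentence id -> model's class probabilities.
pool = {
    "sent_a": [0.98, 0.02],  # model is confident
    "sent_b": [0.55, 0.45],  # model is uncertain -> most useful to annotate
    "sent_c": [0.80, 0.20],
}

def select_for_annotation(pool, k=1):
    # Pick the k highest-entropy examples to send to annotators.
    return sorted(pool, key=lambda s: entropy(pool[s]), reverse=True)[:k]

chosen = select_for_annotation(pool)  # selects the most uncertain sentence
```

After annotation, the labeled examples are added to the training set and the model is retrained, repeating the loop until the budget is spent.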