CMU Neural Nets for NLP 2020 - Multitask and Multilingual Learning

Graham Neubig via YouTube

Classroom Contents

  1. Intro
  2. Remember, Neural Nets are Feature Extractors!
  3. Reminder: Types of Learning
  4. Standard Multi-task Learning
  5. Selective Parameter Adaptation: sometimes it is better to adapt only some of the parameters (see the first sketch after this list)
  6. Different Layers for Different Tasks (Hashimoto et al. 2017)
  7. Multiple Annotation Standards
  8. Supervised/Unsupervised Adaptation
  9. Supervised Domain Adaptation through Feature Augmentation
  10. Unsupervised Learning through Feature Matching
  11. Multi-lingual Sequence-to-sequence Models
  12. Multi-lingual Pre-training
  13. Difficulties in Fully Multi-lingual Learning
  14. Data Balancing
  15. Cross-lingual Transfer Learning
  16. What if languages don't share the same script?
  17. Zero-shot Transfer to New Languages
  18. Data Creation, Active Learning: in order to get in-language training data, Active Learning (AL) can be used (see the second sketch after this list)
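
Item 5 notes that it is sometimes better to adapt only some of the parameters. Below is a minimal PyTorch-style sketch of that idea; the toy `Tagger` model, its layer names, and the choice to adapt only the output head are illustrative assumptions, not details from the lecture.

```python
import torch
import torch.nn as nn

class Tagger(nn.Module):
    """Toy model: a shared feature extractor plus one task-specific head."""
    def __init__(self, vocab_size=30000, dim=256, n_tags=10):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)          # shared
        self.encoder = nn.GRU(dim, dim, batch_first=True)   # shared
        self.head = nn.Linear(dim, n_tags)                  # task-specific

    def forward(self, x):
        hidden, _ = self.encoder(self.embed(x))
        return self.head(hidden)

model = Tagger()

# Selective adaptation: freeze everything, then unfreeze only the head.
for p in model.parameters():
    p.requires_grad = False
for p in model.head.parameters():
    p.requires_grad = True

# The optimizer only ever sees the parameters left trainable.
optimizer = torch.optim.Adam(
    [p for p in model.parameters() if p.requires_grad], lr=1e-3
)
```

Gradients then flow only into the task-specific head, while the shared extractor keeps its pretrained weights.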

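Item 18 mentions Active Learning (AL) as a way to obtain in-language training data. Below is a minimal sketch of one common AL strategy, uncertainty sampling; the `predict_probs` interface (one per-token probability distribution per sentence) and the entropy scoring are illustrative assumptions, not necessarily the specific method covered in the lecture.

```python
import math

def token_entropy(probs):
    """Entropy of one token's predictive distribution; higher = less certain."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def select_for_annotation(unlabeled, predict_probs, budget):
    """Rank unlabeled sentences by mean per-token entropy and return
    the `budget` most uncertain ones to send to human annotators."""
    scored = []
    for sent in unlabeled:
        dists = predict_probs(sent)          # one distribution per token
        score = sum(token_entropy(d) for d in dists) / max(len(dists), 1)
        scored.append((score, sent))
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [sent for _, sent in scored[:budget]]
```

This supports the usual AL loop: train on the current labels, score the unlabeled pool, have annotators label the most uncertain sentences, and retrain.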