Neural Nets for NLP 2017 - Multilingual and Multitask Learning


Graham Neubig via YouTube

Now playing (5 of 19): Rule of Thumb 1: Multitask to Increase Data

Class Central Classrooms (beta): YouTube videos curated by Class Central.

Classroom Contents


  1. Intro
  2. Remember, Neural Nets are Feature Extractors!
  3. Types of Learning
  4. Plethora of Tasks in NLP
  5. Rule of Thumb 1: Multitask to Increase Data
  6. Rule of Thumb 2
  7. Standard Multi-task Learning
  8. Examples of Pre-training Encoders
  9. Regularization for Pre-training (e.g. Barone et al. 2017)
  10. Selective Parameter Adaptation
  11. Soft Parameter Tying
  12. Supervised/Unsupervised Adaptation
  13. Supervised Domain Adaptation through Feature Augmentation
  14. Unsupervised Learning through Feature Matching
  15. Multilingual Inputs
  16. Multilingual Structured Prediction/Multilingual Outputs
  17. Teacher-student Networks for Multilingual Adaptation (Chen et al. 2017)
  18. Types of Multi-tasking
  19. Multiple Annotation Standards
