CMU Neural Nets for NLP 2020 - Multitask and Multilingual Learning

Graham Neubig via YouTube

Overview

Explore multitask and multilingual learning in natural language processing through this lecture from CMU's Neural Networks for NLP course. Delve into the fundamentals of multitask learning, examining various methods and objectives specific to NLP. Investigate multilingual learning techniques, including multilingual sequence-to-sequence models and pre-training approaches. Learn about challenges in fully multilingual learning, such as data balancing and cross-lingual transfer. Discover strategies for handling languages with different scripts and implementing zero-shot transfer to new languages. Gain insights into data creation and active learning for obtaining in-language training data.
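
To make the multitask idea concrete, here is a minimal sketch, not code from the lecture, of standard multi-task learning in PyTorch (the framework and the two example tasks are our assumptions): one shared encoder acts as the feature extractor, with a small task-specific head for a hypothetical token-tagging task and another for sentence classification.

```python
import torch
import torch.nn as nn

class MultiTaskModel(nn.Module):
    """Shared encoder + one head per task (a minimal sketch)."""
    def __init__(self, vocab_size=1000, hidden=128, num_tags=10, num_labels=2):
        super().__init__()
        # Shared feature extractor ("neural nets are feature extractors!")
        self.embed = nn.Embedding(vocab_size, hidden)
        self.encoder = nn.LSTM(hidden, hidden, batch_first=True)
        # Task-specific heads
        self.tagger = nn.Linear(hidden, num_tags)        # token-level task
        self.classifier = nn.Linear(hidden, num_labels)  # sentence-level task

    def forward(self, tokens, task):
        states, _ = self.encoder(self.embed(tokens))
        if task == "tagging":
            return self.tagger(states)           # logits per token
        return self.classifier(states.mean(1))   # pooled sentence logits

model = MultiTaskModel()
opt = torch.optim.Adam(model.parameters())
# In training, batches from the tasks are interleaved (or losses summed),
# so the shared encoder receives gradients from every task.
tag_logits = model(torch.randint(0, 1000, (4, 12)), task="tagging")
cls_logits = model(torch.randint(0, 1000, (4, 12)), task="classify")
```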

Syllabus

Intro
Remember, Neural Nets are Feature Extractors!
Reminder: Types of Learning
Standard Multi-task Learning
Selective Parameter Adaptation: sometimes it is better to adapt only some of the parameters (see the parameter-freezing sketch after this list)
Different Layers for Different Tasks (Hashimoto et al. 2017)
Multiple Annotation Standards
Supervised/Unsupervised Adaptation
Supervised Domain Adaptation through Feature Augmentation
Unsupervised Learning through Feature Matching
Multi-lingual Sequence-to-sequence Models
Multi-lingual Pre-training
Difficulties in Fully Multi-lingual Learning
Data Balancing
Cross-lingual Transfer Learning
What if languages don't share the same script?
Zero-shot Transfer to New Languages
Data Creation and Active Learning: in order to get in-language training data, Active Learning (AL) can be used (see the uncertainty-sampling sketch after this list)
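
The selective parameter adaptation item above notes that it is sometimes better to adapt only some of the parameters. A minimal sketch of one common way to do this (assuming PyTorch and a toy model, neither from the lecture) is to freeze the shared lower layers and fine-tune only the task-specific output layer:

```python
import torch
import torch.nn as nn

# Toy stand-in for a pretrained model: lower shared layers + task head
model = nn.Sequential(
    nn.Embedding(1000, 128),   # shared, pretrained layers ...
    nn.Linear(128, 128),
    nn.Tanh(),
    nn.Linear(128, 5),         # ... and a task-specific output layer
)

# Adapt only some of the parameters: freeze everything except the last layer
for p in model[:-1].parameters():
    p.requires_grad = False

# The optimizer then updates only the unfrozen (head) parameters
optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-4
)
```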
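
For the final item, one common way to realize Active Learning, shown here as an illustrative assumption rather than the lecture's specific recipe, is least-confidence sampling: score unlabeled in-language sentences with the current model and send the ones it is least sure about to annotators first. In this hypothetical PyTorch sketch, `model` and `unlabeled_batches` are stand-ins:

```python
import torch

def select_for_annotation(model, unlabeled_batches, k=100):
    """Return indices of the k batches the model is least confident about."""
    scores = []
    model.eval()
    with torch.no_grad():
        for i, x in enumerate(unlabeled_batches):
            probs = torch.softmax(model(x), dim=-1)
            confidence = probs.max(dim=-1).values.mean()  # avg top probability
            scores.append((confidence.item(), i))
    scores.sort()  # least confident first
    return [i for _, i in scores[:k]]
```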

Taught by

Graham Neubig
