
YouTube

CMU Low Resource NLP Bootcamp 2020 - Neural Representation Learning

Graham Neubig via YouTube

Overview

Explore neural representation learning in natural language processing through this lecture from the CMU Low Resource NLP Bootcamp 2020. Delve into methods for learning neural representations of language, covering word and sentence representations, supervised and unsupervised learning approaches, and case studies on NNLM, GloVe, ELMo, and BERT. Gain insights into different structural biases, clusters of approaches, and must-know points about RNN, CNN, and Transformer models. Learn when to use non-contextualized versus contextualized representations, and understand the roles of software, models, and corpora in neural representation learning for NLP.
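As a rough illustration of the non-contextualized versus contextualized distinction the lecture discusses, the sketch below (not from the lecture; it assumes the gensim and Hugging Face transformers packages and pretrained GloVe/BERT checkpoints) contrasts a single static vector for "bank" with BERT vectors that change with the sentence.

```python
# Minimal sketch (illustrative only, not part of the course materials):
# non-contextualized vs. contextualized word representations.
import gensim.downloader as api
import torch
from transformers import AutoModel, AutoTokenizer

# Non-contextualized: one fixed vector per word type (GloVe).
glove = api.load("glove-wiki-gigaword-100")   # downloads pretrained vectors
static_bank = glove["bank"]                   # same vector in every sentence

# Contextualized: the vector for "bank" depends on the sentence (BERT).
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def bert_vector(sentence, word):
    """Return the hidden state of the first occurrence of `word` in `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]   # (seq_len, 768)
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    return hidden[tokens.index(word)]

river_bank = bert_vector("She sat on the bank of the river.", "bank")
money_bank = bert_vector("He deposited cash at the bank.", "bank")

# The two contextualized vectors differ; the single GloVe vector cannot.
cos = torch.nn.functional.cosine_similarity(river_bank, money_bank, dim=0)
print(f"cosine(river bank, money bank) = {cos:.3f}")
print(f"GloVe vector dimensionality = {static_bank.shape[0]}")
```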

Syllabus

Neural Representation Learning in Natural Language Processing
Neural Representation Learning for NLP
What is a word representation?
Why should we learn word representations?
How can we get word representations?
Symbolic or Distributed?
Supervised or Unsupervised?
Count-based or Prediction-based?
Case Study: NNLM
Case Study: GloVe
Case Study: ELMo
Case Study: BERT
Software, Model, Corpus
Using non-contextualized when ...
Using contextualized when ...
What is a sentence representation?
Why do we need sentence representations?
How can we learn sentence representations?
Different Structural Biases
Clusters of Approaches
Case Study: Must-know Points about RNN
CNN: 1d and 2d Convolution
CNN: Narrow/Equal/Wide Convolution
CNN: Multiple Filter Convolution
Case Study: Must-know Points about Transformer

Taught by

Graham Neubig
