Overview
Syllabus
Neural Representation Learning in Natural Language Processing
Neural Representation Learning for NLP
What is a word representation?
Why should we learn word representations?
How can we get word representations?
Symbolic or Distributed?
Supervised or Unsupervised?
Count-based or Prediction-based?
Case Study: NNLM
Case Study: GloVe
Case Study: ELMo
Case Study: BERT
Software, Model, Corpus
Using non-contextualized representations when ...
Using contextualized representations when ...
What is a sentence representation?
Why do we need sentence representations?
How can we learn sentence representations?
Different Structural Biases
Clusters of Approaches
Case Study: Must-know Points about RNN
CNN: 1d and 2d Convolution
CNN: Narrow / Equal / Wide Convolution
CNN: Multiple Filter Convolution
Case Study: Must-know Points about Transformer
Taught by
Graham Neubig