CMU Low Resource NLP Bootcamp 2020 - Neural Representation Learning
- 1 Neural Representation Learning in Natural Language Processing
- 2 Neural Representation Learning for NLP
- 3 What is a word representation?
- 4 Why should we learn word representations?
- 5 How can we get word representations?
- 6 Symbolic or Distributed?
- 7 Supervised or Unsupervised?
- 8 Count-based or Prediction-based?
- 9 Case Study: NNLM
- 10 Case Study: GloVe
- 11 Case Study: ELMo
- 12 Case Study: BERT
- 13 Software, Model, Corpus
- 14 Using non-contextualized when ...
- 15 Using contextualized when ...
- 16 What is a sentence representation?
- 17 Why do we need sentence representations?
- 18 How can we learn sentence representations?
- 19 Different Structural Biases
- 20 Clusters of Approaches
- 21 Case Study: Must-know Points about RNNs
- 22 CNN: 1d and 2d Convolution
- 23 CNN: Narrow/Equal/Wide Convolution (see the sketch after this list)
- 24 CNN: Multiple Filter Convolution
- 25 Case Study: Must-know Points about Transformers
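
For item 23, here is a minimal sketch (not from the bootcamp materials) of what narrow, equal, and wide 1D convolution mean in practice: the three variants apply the same filter but pad the input differently, so they differ only in output length. NumPy's `mode` argument (`'valid'`, `'same'`, `'full'`) corresponds to narrow, equal, and wide convolution respectively; the sequence and kernel values below are arbitrary illustrations.

```python
import numpy as np

sequence = np.array([1.0, 2.0, 3.0, 4.0, 5.0])  # e.g. 5 token features, n = 5
kernel = np.array([0.25, 0.5, 0.25])            # filter of width m = 3

# 'valid' = narrow: filter stays fully inside the sequence.
# 'same'  = equal:  output padded to match the input length.
# 'full'  = wide:   every partial overlap counts.
for name, mode in [("narrow", "valid"), ("equal", "same"), ("wide", "full")]:
    out = np.convolve(sequence, kernel, mode=mode)
    print(f"{name:6s} ({mode}): length {len(out)} -> {out}")
```

For a sequence of length n and a filter of width m, narrow convolution yields n - m + 1 outputs (3 here), equal yields n (5), and wide yields n + m - 1 (7).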