Classroom Contents
Neural Nets for NLP 2020 - Convolutional Neural Networks for Text
- 1 Intro
- 2 Outline
- 3 An Example Prediction Problem: Sentiment Classification
- 4 Continuous Bag of Words (CBOW)
- 5 Deep CBOW
- 6 Why Bag of n-grams?
- 7 What Problems
- 8 Neural Sequence Models
- 9 Definition of Convolution
- 10 Intuitive Understanding
- 11 Priors Entailed by CNNs
- 12 Concept: 2d Convolution
- 13 Concept: Stride
- 14 Concept: Padding
- 15 Three Types of Convolutions
- 16 Concept: Multiple Filters
- 17 Concept: Pooling
- 18 Overview of the Architecture
- 19 Embedding Layer
- 20 Conv. Layer
- 21 Pooling Layer
- 22 Output Layer
- 23 Dynamic Filter CNN (e.g. Brabandere et al. 2016)
- 24 CNN Applications
- 25 NLP (Almost) from Scratch (Collobert et al. 2011)
- 26 CNN-RNN-CRF for Tagging (Ma et al. 2016)
- 27 Why Structured Convolution?
- 28 Understand the design philosophy of a model
- 29 Structural Bias
- 30 component entail?
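
The concept chapters above (9 "Definition of Convolution", 13 "Stride", 14 "Padding") boil down to one piece of arithmetic: the output length of a 1D convolution is floor((L_in + 2*pad - k) / stride) + 1. The NumPy sketch below illustrates that idea; the input values, the filter, and the `conv1d` helper are illustrative assumptions, not material from the lecture.

```python
import numpy as np

def conv1d(x, w, stride=1, pad=0):
    """1D cross-correlation (the 'convolution' used in CNNs) with stride and padding."""
    x = np.pad(x, pad)                        # concept: padding (chapter 14)
    k = len(w)
    out_len = (len(x) - k) // stride + 1      # output-length arithmetic
    return np.array([np.dot(x[i * stride:i * stride + k], w)   # concept: stride (chapter 13)
                     for i in range(out_len)])

x = np.array([1., 2., 3., 4., 5.])            # toy input sequence (assumed values)
w = np.array([1., 0., -1.])                   # one toy filter (assumed values)
print(conv1d(x, w))                           # no padding -> 3 outputs
print(conv1d(x, w, pad=1))                    # pad=1      -> 5 outputs (same length as input)
print(conv1d(x, w, stride=2, pad=1))          # stride=2   -> 3 outputs
```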
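
Chapters 18-22 ("Overview of the Architecture" through "Output Layer"), together with the "Multiple Filters" and "Pooling" concepts, describe the standard CNN text classifier. The PyTorch sketch below is one plausible rendering of that pipeline under assumed hyperparameters (vocabulary size, embedding dimension, filter count, window size, class count); it is not the lecture's reference implementation.

```python
import torch
import torch.nn as nn


class CNNTextClassifier(nn.Module):
    def __init__(self, vocab_size=10000, emb_dim=64, num_filters=100,
                 window_size=3, num_classes=2):
        super().__init__()
        # Embedding layer: token ids -> dense vectors (chapter 19)
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Convolutional layer: multiple filters slide over the sequence (chapters 16, 20)
        self.conv = nn.Conv1d(emb_dim, num_filters, kernel_size=window_size, padding=1)
        # Output layer: one score per class (chapter 22)
        self.out = nn.Linear(num_filters, num_classes)

    def forward(self, token_ids):             # token_ids: (batch, seq_len)
        x = self.embed(token_ids)             # (batch, seq_len, emb_dim)
        x = x.transpose(1, 2)                 # Conv1d expects (batch, channels, seq_len)
        x = torch.relu(self.conv(x))          # (batch, num_filters, seq_len)
        # Pooling layer: max over time collapses variable length (chapters 17, 21)
        x = x.max(dim=2).values               # (batch, num_filters)
        return self.out(x)                    # (batch, num_classes)


# Usage: classify a dummy batch of two 7-token sentences (assumed toy data).
model = CNNTextClassifier()
logits = model(torch.randint(0, 10000, (2, 7)))
print(logits.shape)  # torch.Size([2, 2])
```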