YouTube

Neural Nets for NLP - Recurrent Networks for Sentence or Language Modeling

Graham Neubig via YouTube

Overview

Explore recurrent neural networks for sentence and language modeling in this comprehensive lecture from CMU's Neural Networks for NLP course. Dive into the structure and capabilities of RNNs, understand the vanishing gradient problem and how LSTMs address it, and examine the strengths and weaknesses of recurrence in sentence modeling. Learn about pre-training techniques for RNNs and gain insights into handling long sequences and long-distance dependencies in language processing. Discover practical applications like language modeling and sentence representation through detailed examples and explanations.
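
The overview's core idea, reading a sentence word by word, updating a hidden state at each step, and scoring the next word from that state, can be sketched briefly. Below is a minimal, hedged illustration in PyTorch (not the course's own code; the vocabulary size, layer sizes, and toy sentence are placeholder assumptions). A second sketch after the syllabus illustrates the vanishing-gradient point.

    # Minimal RNN language model sketch: loss calculation and state update.
    # Illustrative only; sizes and the example sentence are arbitrary.
    import torch
    import torch.nn as nn

    vocab_size, embed_dim, hidden_dim = 1000, 64, 128

    embed = nn.Embedding(vocab_size, embed_dim)             # word ids -> vectors
    rnn = nn.RNN(embed_dim, hidden_dim, batch_first=True)   # Elman-style recurrence
    out = nn.Linear(hidden_dim, vocab_size)                 # hidden state -> next-word scores
    loss_fn = nn.CrossEntropyLoss()

    # One toy sentence as word ids; predict each word from the preceding prefix.
    sentence = torch.tensor([[5, 42, 7, 99, 3]])
    inputs, targets = sentence[:, :-1], sentence[:, 1:]

    hidden_states, _ = rnn(embed(inputs))    # hidden state updated at every time step
    logits = out(hidden_states)              # one next-word distribution per position
    loss = loss_fn(logits.reshape(-1, vocab_size), targets.reshape(-1))
    loss.backward()                          # gradients flow back through time
    print(loss.item())

Swapping nn.RNN for nn.LSTM in the same sketch is the standard mitigation for the vanishing-gradient problem the lecture discusses.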

Syllabus

Intro
Why Model Sentence Pairs?
Siamese Network (Bromley et al. 1993)
Convolutional Matching Model (Hu et al. 2014) • Concatenate sentences into a 3D tensor and perform convolution
Convolutional Features + Matrix-Based Pooling (Yin and Schütze 2015)
NLP and Sequential Data
Long-distance Dependencies in Language
Can be Complicated!
Recurrent Neural Networks (Elman 1990)
Unrolling in Time • What does processing a sequence look like?
What Can RNNs Do?
Representing Sentences
e.g. Language Modeling
RNNLM Example: Loss Calculation and State Update
Vanishing Gradient • Gradients decrease as they get pushed back
LSTM Structure
What can LSTMs Learn? (Shi et al. 2016, Radford et al. 2017) • Count length of sentence
Handling Long Sequences
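
As a companion to the "Vanishing Gradient" and "LSTM Structure" items above, here is a small hedged experiment (an illustration under assumed sizes, not material from the lecture): push a long random sequence through a plain tanh RNN and an LSTM, take a loss on the final state only, and compare how much gradient reaches the first time step. With the plain RNN the gradient at step 0 is typically orders of magnitude smaller.

    # Compare the gradient reaching the first time step for RNN vs. LSTM.
    # Sequence length and dimensions are arbitrary assumptions.
    import torch
    import torch.nn as nn

    seq_len, dim = 100, 32

    def first_step_grad_norm(cell):
        x = torch.randn(1, seq_len, dim, requires_grad=True)
        h, _ = cell(x)                      # run the whole sequence
        h[:, -1].sum().backward()           # loss depends only on the final state
        return x.grad[:, 0].norm().item()   # gradient that reaches time step 0

    torch.manual_seed(0)
    print("RNN :", first_step_grad_norm(nn.RNN(dim, dim, batch_first=True)))
    print("LSTM:", first_step_grad_norm(nn.LSTM(dim, dim, batch_first=True)))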

Taught by

Graham Neubig

