
CMU Advanced NLP: Recurrent Neural Networks

Graham Neubig via YouTube

Overview

Explore recurrent neural networks in this advanced natural language processing lecture from Carnegie Mellon University. Delve into long-distance dependencies, the Winograd Schema Challenge, and the different types of prediction problems in NLP. Examine the structure and functionality of recurrent networks, including the vanishing gradient problem and the Long Short-Term Memory (LSTM) architecture designed to address it. Analyze the strengths and weaknesses of recurrence in sentence modeling, and discover the potential of pre-training techniques for RNNs. Gain insights into efficiency considerations and optimization strategies for these models.
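
The lecture contrasts plain recurrent networks, whose gradients tend to vanish over long sequences, with LSTMs, whose gated cell-state updates preserve information over longer spans. As a rough illustration of that distinction (not material from the lecture itself), the sketch below builds a tiny language model in PyTorch that can swap between the two cell types; the framework choice, layer sizes, and names are assumptions made for illustration only.

# Minimal sketch, assuming PyTorch; vocabulary size, dimensions, and names are illustrative.
import torch
import torch.nn as nn

class TinyLanguageModel(nn.Module):
    def __init__(self, vocab_size=1000, embed_dim=64, hidden_dim=128, cell="lstm"):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # nn.RNN is the plain recurrent cell; nn.LSTM adds gated, additive
        # cell-state updates that help gradients survive long sequences.
        rnn_cls = nn.LSTM if cell == "lstm" else nn.RNN
        self.rnn = rnn_cls(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, token_ids):
        x = self.embed(token_ids)   # (batch, seq_len, embed_dim)
        states, _ = self.rnn(x)     # one hidden state per position
        return self.out(states)     # next-token logits at every step

# Usage: two random sequences of 16 token ids.
model = TinyLanguageModel(cell="lstm")
tokens = torch.randint(0, 1000, (2, 16))
print(model(tokens).shape)          # torch.Size([2, 16, 1000])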

Syllabus

Intro
Long Distance Dependencies
Winograd Schema Challenge
Types of Prediction
Unconditioned vs Conditioned Prediction
Types of Unconditioned Prediction
Types of Conditioned Prediction
Recurrent Neural Networks
Vanishing Gradients
LSTM
RNNs
Other examples
Efficiency
Optimization

Taught by

Graham Neubig
