
Neural Nets for NLP 2021 - Language Modeling, Efficiency/Training Tricks

Graham Neubig via YouTube

Overview

Explore a comprehensive lecture on language modeling and neural network training techniques for natural language processing. Delve into count-based and feed-forward neural network language models, methods for preventing overfitting, manual and automatic mini-batching, and optimizers such as SGD, SGD with momentum, Adagrad, and Adam. Learn how to measure language model performance using accuracy, likelihood, and perplexity. Gain insights into efficient training tricks, regularization techniques, and LSTM language model optimization. Enhance your understanding of neural networks for NLP through in-depth explanations and practical examples from Graham Neubig in this CMU CS 11-747 course lecture.
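For reference, perplexity (the evaluation metric emphasized in the lecture) is the exponentiated average negative log-likelihood of a held-out corpus, so lower is better. This is the standard definition rather than notation taken from the slides:

    PPL(w_1, ..., w_N) = exp( -(1/N) * sum_{i=1..N} log p(w_i | w_1, ..., w_{i-1}) )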

Syllabus

Intro
Language Modeling: Calculating the Probability of a Sentence
Count-based Language Models
A Refresher on Evaluation
Problems and Solutions?
An Alternative: Featurized Models
A Computation Graph View
A Note: "Lookup"
Training a Model
Parameter Update
Unknown Words
Evaluation and Vocabulary
Linear Models can't Learn Feature Combinations
Neural Language Models (see Bengio et al. 2004)
Tying Input/Output Embeddings
Standard SGD
SGD With Momentum
Adagrad
Adam
Shuffling the Training Data
Neural nets have lots of parameters, and are prone to overfitting
Efficiency Tricks: Mini-batching
Mini-batching
Manual Mini-batching
Mini-batched Code Example (see the sketch after this syllabus)
Automatic Mini-batching!
Code-level Optimization (e.g., TorchScript provides a restricted representation of a PyTorch module that can be run efficiently in C++)
Regularizing and Optimizing LSTM Language Models (Merity et al. 2017)
In-class Discussion
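As a companion to the mini-batching and optimizer chapters above, here is a minimal, hypothetical PyTorch sketch of manual mini-batching for an LSTM language model. It is not code from the lecture; the model, vocabulary size, padding index, and toy token ids are all illustrative assumptions. Variable-length sentences are padded to the batch maximum, and the loss ignores padding positions.

import torch
import torch.nn as nn

# Hypothetical illustration of manual mini-batching for language modeling;
# not code from the lecture. PAD and VOCAB are assumed values.
PAD = 0
VOCAB = 10000

class LSTMLM(nn.Module):
    def __init__(self, vocab_size, dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim, padding_idx=PAD)
        self.lstm = nn.LSTM(dim, dim, batch_first=True)
        self.out = nn.Linear(dim, vocab_size)

    def forward(self, x):
        h, _ = self.lstm(self.embed(x))
        return self.out(h)  # logits of shape (batch, time, vocab)

def make_batch(sentences):
    # Pad variable-length token-id lists to a rectangular tensor.
    max_len = max(len(s) for s in sentences)
    return torch.tensor([s + [PAD] * (max_len - len(s)) for s in sentences])

model = LSTMLM(VOCAB)
opt = torch.optim.Adam(model.parameters())       # one of the optimizers covered
loss_fn = nn.CrossEntropyLoss(ignore_index=PAD)  # don't score padding tokens

batch = make_batch([[5, 42, 7, 2], [9, 3, 2]])   # toy token ids; 2 = assumed </s>
inputs, targets = batch[:, :-1], batch[:, 1:]    # predict the next token
logits = model(inputs)
loss = loss_fn(logits.reshape(-1, VOCAB), targets.reshape(-1))
loss.backward()
opt.step()
opt.zero_grad()

Processing the whole batch in one forward pass, rather than one sentence at a time, is the efficiency win the mini-batching chapters describe; the same pattern extends to automatic batching frameworks.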

Taught by

Graham Neubig
