Overview
Explore search-based structured prediction in natural language processing through this lecture from CMU's Neural Networks for NLP course. Delve into the Structured Perceptron algorithm, a strikingly simple way to train non-probabilistic, globally scored models. Contrast the perceptron with globally normalized probabilistic models, and investigate structured training and pre-training. Learn about the cost-augmented hinge loss and its application to sequence modeling, and see how structured max-margin objectives fit into NLP tasks. Gain insights into exposure bias and its simple remedies, including deliberately corrupting the training data so the model learns to recover from its own mistakes.
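As a rough illustration of the core idea, the sketch below implements the structured perceptron update for sequence labeling, assuming a simple linear tagger with word-tag and tag-tag weights; the model, feature set, and all names are illustrative assumptions, not the lecture's own code.

```python
# Minimal sketch of structured perceptron training for sequence labeling,
# assuming a linear model with word-tag (emission) and tag-tag (transition)
# weights. Names and feature set are illustrative, not from the lecture.
import numpy as np

def viterbi(emit, trans):
    """Return the one-best tag sequence under the current scores.
    emit: (T, K) per-position tag scores for one sentence; trans: (K, K)."""
    T, K = emit.shape
    best = emit[0].copy()
    back = np.zeros((T, K), dtype=int)
    for t in range(1, T):
        cand = best[:, None] + trans + emit[t]   # score of (prev tag, cur tag)
        back[t] = cand.argmax(axis=0)
        best = cand.max(axis=0)
    tags = [int(best.argmax())]
    for t in range(T - 1, 0, -1):
        tags.append(int(back[t, tags[-1]]))
    return tags[::-1]

def perceptron_epoch(data, vocab_size, num_tags):
    """One pass of the structured perceptron: decode the one-best sequence,
    and if it differs from the gold answer, add the gold-sequence features
    and subtract the predicted-sequence features."""
    W_emit = np.zeros((vocab_size, num_tags))
    W_trans = np.zeros((num_tags, num_tags))
    for words, gold in data:                     # words, gold: lists of ints
        pred = viterbi(W_emit[words], W_trans)
        if pred == list(gold):
            continue                             # already correct: no update
        for t, w in enumerate(words):
            W_emit[w, gold[t]] += 1.0
            W_emit[w, pred[t]] -= 1.0
            if t > 0:
                W_trans[gold[t - 1], gold[t]] += 1.0
                W_trans[pred[t - 1], pred[t]] -= 1.0
    return W_emit, W_trans
```

A few passes of perceptron_epoch over a small tagged corpus are enough to see the update push the one-best decode toward the gold sequences.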
Syllabus
Intro
Types of Prediction
Two Methods for Approximation
Structured Perceptron Loss
The Structured Perceptron Algorithm: an extremely simple way of training (non-probabilistic) global models. Find the one-best hypothesis, and if its score is better than the correct answer's, adjust the parameters to fix this
Contrasting Perceptron and Global Normalization
Structured Training and Pre-training
Cost-augmented Hinge (see the sketch after this syllabus)
Costs over Sequences
Corrupt Training Data
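The cost-augmented hinge and costs-over-sequences items tie into the structured max-margin objective mentioned in the overview. The sketch below is an illustration under assumed names rather than the lecture's code: it assumes a Hamming cost over sequences, which decomposes over positions, so cost-augmented decoding is just ordinary Viterbi over per-position scores that give every wrong tag a head start equal to the margin.

```python
# Sketch of the cost-augmented hinge over a sequence, assuming a Hamming cost.
# Illustrative only; names are not taken from the lecture slides.
import numpy as np

def hamming_cost(pred, gold):
    """Number of positions where the predicted tag differs from the gold tag."""
    return sum(p != g for p, g in zip(pred, gold))

def cost_augmented_emissions(emit, gold, margin=1.0):
    """Hamming cost decomposes over positions, so cost-augmented decoding is
    plain Viterbi over emission scores with +margin added at every wrong tag."""
    aug = emit + margin                          # bonus for every tag...
    aug[np.arange(len(gold)), gold] -= margin    # ...except the gold tag
    return aug

def hinge_loss(score_gold, score_pred, pred, gold):
    """Structured hinge: the gold sequence must beat the cost-augmented
    one-best (pred) by at least its Hamming cost, else we pay the difference."""
    return max(0.0, score_pred + hamming_cost(pred, gold) - score_gold)
```

Setting the margin to zero turns cost-augmented decoding back into ordinary decoding and recovers the plain structured perceptron loss sketched in the Overview.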
Taught by
Graham Neubig