YouTube

Neural Nets for NLP 2020 - Search-Based Structured Prediction

Graham Neubig via YouTube

Overview

Explore search-based structured prediction in natural language processing through this comprehensive lecture from CMU's Neural Networks for NLP course. Delve into the Structured Perceptron algorithm, examining its simplicity in training non-probabilistic global models. Contrast perceptron and global normalization approaches, and investigate structured training and pre-training techniques. Learn about the cost-augmented hinge loss and its application to sequence modeling, and understand structured max-margin objectives and their role in NLP tasks. Gain insights into addressing exposure bias with simple remedies, including deliberately corrupting the training data so the model learns to recover from its own mistakes.
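For reference, the cost-augmented hinge loss mentioned above can be written as follows; the notation (score function S, Hamming cost over output positions) is a generic formulation assumed for illustration, not copied from the lecture slides:

\ell(x, y^{*}) = \max\Big(0,\; \max_{\hat{y}} \big[\, S(x, \hat{y}) + \mathrm{cost}(\hat{y}, y^{*}) \,\big] - S(x, y^{*})\Big),
\qquad
\mathrm{cost}(\hat{y}, y^{*}) = \sum_{t} \mathbf{1}[\hat{y}_{t} \neq y^{*}_{t}]

Setting the cost term to zero recovers the plain structured hinge (perceptron-style) loss; keeping it demands a larger margin against outputs that make more errors over the sequence.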

Syllabus

Intro
Types of Prediction
Two Methods for Approximation
Structured Perceptron Loss
The Structured Perceptron Algorithm: an extremely simple way of training (non-probabilistic) global models; find the one-best output, and if its score is better than that of the correct answer, adjust parameters to fix this (see the sketch after this syllabus)
Contrasting Perceptron and Global Normalization (globally normalized probabilistic models)
Structured Training and Pre-training
Cost-augmented Hinge
Costs over Sequences
Corrupt Training Data
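
The lecture page itself does not include code, but as a rough illustration of the structured perceptron item above, here is a minimal sketch in Python. The feature function, the greedy decoder, and all names in it are assumptions made for illustration, not material from the lecture.

```python
import numpy as np

def phi(x, y, num_features):
    """Hypothetical global feature function for an (input, label sequence) pair:
    hashed emission and transition indicator features."""
    vec = np.zeros(num_features)
    for t, label in enumerate(y):
        vec[hash((x[t], label)) % num_features] += 1.0          # word-label feature
        if t > 0:
            vec[hash((y[t - 1], label)) % num_features] += 1.0  # label-bigram feature
    return vec

def decode_one_best(w, x, labels, num_features):
    """Stand-in for the argmax search (a real tagger would use Viterbi);
    a greedy left-to-right search keeps the sketch short."""
    y_hat = []
    for t in range(len(x)):
        scores = {lab: w @ phi(x[: t + 1], y_hat + [lab], num_features) for lab in labels}
        y_hat.append(max(scores, key=scores.get))
    return y_hat

def perceptron_update(w, x, y_gold, labels, num_features):
    """Structured perceptron step: find the one-best output under the current
    weights, and if it is not the gold sequence, move the weights toward the
    gold features and away from the predicted ones."""
    y_hat = decode_one_best(w, x, labels, num_features)
    if y_hat != y_gold:
        w += phi(x, y_gold, num_features) - phi(x, y_hat, num_features)
    return w

# Tiny usage example with made-up data
if __name__ == "__main__":
    labels = ["DET", "NOUN", "VERB"]
    num_features = 2 ** 16
    w = np.zeros(num_features)
    x, y_gold = ["the", "dog", "barks"], ["DET", "NOUN", "VERB"]
    for _ in range(5):  # a few passes over the single example
        w = perceptron_update(w, x, y_gold, labels, num_features)
    print(decode_one_best(w, x, labels, num_features))
```

Because the update fires only when the predicted sequence differs from the gold one, training is non-probabilistic: no normalization over the output space is ever computed, which is exactly the simplicity the syllabus item highlights.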

Taught by

Graham Neubig

