
YouTube

CMU Neural Nets for NLP - Structured Prediction Basics

Graham Neubig via YouTube

Overview

Explore structured prediction basics in this comprehensive lecture from CMU's Neural Nets for NLP 2018 course. Delve into various prediction types, understand the importance of modeling output interactions, and learn about sequence labeling. Discover training methods for structured models, including local normalization and the structured perceptron algorithm. Compare perceptron and global normalization approaches, and examine the use of hinge loss in classifiers. Investigate cost-augmented hinge loss, costs over sequences, and cost-augmented decoding for Hamming loss. Conclude with a solution that samples the model's own mistakes during training, as presented by Ross et al. in 2010.
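
One point from the overview worth making concrete is why modeling output interactions matters. The following is a minimal, hypothetical illustration (not from the lecture): a tagger that chooses each BIO tag independently can produce label sequences that are internally inconsistent, such as an I- tag continuing a differently typed entity.

    # Illustrative only: independent per-token decisions can yield an
    # inconsistent BIO sequence, e.g. I-LOC "continuing" a B-PER span.
    independent_tags = ["B-PER", "I-LOC", "O"]  # locally plausible, globally invalid

    def is_consistent_bio(tags):
        # An I-X tag must continue a span of the same entity type X.
        prev = "O"
        for tag in tags:
            if tag.startswith("I-") and prev[2:] != tag[2:]:
                return False
            prev = tag
        return True

    print(is_consistent_bio(independent_tags))  # False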

Syllabus

A Prediction Problem
Types of Prediction
Why Call it "Structured" Prediction?
Many Varieties of Structured Prediction!
Sequence Labeling
Why Model Interactions in Output? Consistency is important!
A Tagger Considering Output Structure
Training Structured Models
Local Normalization
The Structured Perceptron Algorithm: an extremely simple way of training (non-probabilistic) global models. Find the one-best, and if its score is better than that of the correct answer, adjust parameters to fix this (see the first sketch after this syllabus).
Contrasting Perceptron and Global Normalization
Structured Training and Pre-training
Hinge Loss for Any Classifier: we can swap cross-entropy for hinge loss at any time
Cost-augmented Hinge
Costs over Sequences
Cost-Augmented Decoding for Hamming Loss (see the second sketch after this syllabus)
Solution 1: Sample Mistakes in Training (Ross et al. 2010)
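
The structured perceptron update mentioned above is simple enough to sketch in a few lines. Below is a minimal, illustrative Python sketch, not code from the lecture: features(), score(), and the exhaustive decode() are hypothetical stand-ins, and a real tagger would replace the exhaustive search with Viterbi.

    from collections import defaultdict
    from itertools import product

    def features(x, y):
        # Hypothetical feature map: emission and transition indicators.
        feats = defaultdict(float)
        prev = "<s>"
        for word, tag in zip(x, y):
            feats[("emit", word, tag)] += 1.0
            feats[("trans", prev, tag)] += 1.0
            prev = tag
        return feats

    def score(weights, x, y):
        return sum(weights[f] * v for f, v in features(x, y).items())

    def decode(weights, x, tagset):
        # Placeholder argmax: exhaustive search over all tag sequences.
        return max(product(tagset, repeat=len(x)),
                   key=lambda y: score(weights, x, y))

    def perceptron_update(weights, x, gold, tagset):
        # Find the one-best output; if it is wrong (so its score is at
        # least as good as the gold answer's), shift weight from its
        # features to the gold features. Updates weights in place.
        pred = decode(weights, x, tagset)
        if list(pred) != list(gold):
            for f, v in features(x, gold).items():
                weights[f] += v
            for f, v in features(x, pred).items():
                weights[f] -= v

    weights = defaultdict(float)
    perceptron_update(weights, ["the", "movie"], ["DET", "NOUN"],
                      ["DET", "NOUN", "VERB"])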

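Cost-augmented decoding for Hamming loss is also easy to illustrate, because Hamming cost decomposes per position and can simply be added to the model score inside the same argmax. The hedged sketch below reuses the hypothetical score() helper and the product import from the perceptron sketch above; it is not code from the lecture.

    def hamming_cost(y, gold):
        # Number of positions where the prediction disagrees with gold.
        return sum(1.0 for a, b in zip(y, gold) if a != b)

    def cost_augmented_decode(weights, x, gold, tagset):
        # Hamming cost decomposes over positions, so adding it to the
        # model score leaves a search the same decoder can solve.
        return max(product(tagset, repeat=len(x)),
                   key=lambda y: score(weights, x, y) + hamming_cost(y, gold))

    def cost_augmented_hinge(weights, x, gold, tagset):
        # Margin violation against the most dangerous competitor:
        # high model score plus high cost, relative to the gold score.
        y_hat = cost_augmented_decode(weights, x, gold, tagset)
        margin = (score(weights, x, y_hat) + hamming_cost(y_hat, gold)
                  - score(weights, x, gold))
        return max(0.0, margin)
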
Taught by

Graham Neubig
