
Neural Nets for NLP - Structured Prediction Basics

Graham Neubig via YouTube

Overview

Explore structured prediction basics in this lecture from CMU's Neural Networks for NLP course. Delve into the Structured Perceptron algorithm, structured max-margin objectives, and simple remedies to exposure bias. Learn about various types of prediction, the importance of modeling output interactions, and training methods for structured models. Examine local normalization, global normalization, and cost-augmented decoding for Hamming loss. Gain insights into sequence labeling, tagger considerations for output structure, and the challenges associated with structured hinge loss.
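The structured perceptron mentioned above can be sketched in a few lines. This is a minimal illustration, not the course's actual code: the indicator features, toy data, and brute-force decoder are assumptions made for brevity (a real tagger would use Viterbi decoding and richer features).

```python
# Minimal structured perceptron for sequence labeling (illustrative sketch).
# The "structure" is a whole tag sequence, scored with simple emission and
# transition indicator features; decoding enumerates all tag sequences for
# clarity, which only works for tiny tag sets and short sentences.
from itertools import product

TAGS = ["N", "V"]

def score(weights, words, tags):
    """Score a (sentence, tag sequence) pair with emission/transition features."""
    s, prev = 0.0, "<s>"
    for word, tag in zip(words, tags):
        s += weights.get(("emit", word, tag), 0.0)
        s += weights.get(("trans", prev, tag), 0.0)
        prev = tag
    return s

def decode(weights, words):
    """Return the highest-scoring tag sequence (brute force for clarity)."""
    return max(product(TAGS, repeat=len(words)),
               key=lambda tags: score(weights, words, tags))

def perceptron_update(weights, words, gold, lr=1.0):
    """One structured perceptron step: if the model's best structure differs
    from the gold structure, boost gold features and demote predicted ones."""
    pred = decode(weights, words)
    if pred != tuple(gold):
        prev_g, prev_p = "<s>", "<s>"
        for word, g, p in zip(words, gold, pred):
            weights[("emit", word, g)] = weights.get(("emit", word, g), 0.0) + lr
            weights[("emit", word, p)] = weights.get(("emit", word, p), 0.0) - lr
            weights[("trans", prev_g, g)] = weights.get(("trans", prev_g, g), 0.0) + lr
            weights[("trans", prev_p, p)] = weights.get(("trans", prev_p, p), 0.0) - lr
            prev_g, prev_p = g, p
    return pred

# Toy training loop on two invented sentences.
weights = {}
data = [(["dogs", "run"], ["N", "V"]), (["cats", "sleep"], ["N", "V"])]
for _ in range(5):
    for words, gold in data:
        perceptron_update(weights, words, gold)
```

Note the key property the lecture highlights: the update is non-probabilistic and global, touching only the features where the predicted and gold structures disagree.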

Syllabus

Intro
A Prediction Problem
Types of Prediction
Why Call it "Structured" Prediction?
Many Varieties of Structured Prediction!
Sequence Labeling as
Sequence Labeling w
Why Model Interactions in Output? Consistency is important!
A Tagger Considering Output Structure
Training Structured Models
Local Normalization and
The Structured Perceptron Algorithm: an extremely simple way of training (non-probabilistic) global models
Structured Perceptron Loss
Contrasting Perceptron and Global Normalization: globally normalized probabilistic models
Structured Training and Pre-training
Cost-Augmented Decoding for Hamming Loss: Hamming loss is decomposable over each word; solution: add a cost to the score of each incorrect choice during search
What's Wrong w/ Structured Hinge Loss?
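The cost-augmented decoding idea from the syllabus can be sketched as follows. This is an assumed illustration, not the lecture's code: because Hamming loss decomposes over positions, loss-augmented inference just adds a constant cost to every tag that disagrees with the gold tag and then decodes as usual. For simplicity this sketch uses per-position scores with no transition features, so the argmax can be taken independently at each word.

```python
# Cost-augmented decoding for Hamming loss (illustrative sketch, made-up scores).
# Finds the structure maximizing (model score + Hamming cost vs. gold), i.e.
# the "most dangerous" wrong answer used inside a structured hinge loss.

def cost_augmented_decode(scores, gold, cost=1.0):
    """scores: list over positions of {tag: model score} dicts.
    gold: the reference tag sequence.
    Adds `cost` to every non-gold tag's score, then takes the per-position
    argmax (valid here because there are no transition scores)."""
    out = []
    for pos_scores, g in zip(scores, gold):
        augmented = {t: s + (cost if t != g else 0.0)
                     for t, s in pos_scores.items()}
        out.append(max(augmented, key=augmented.get))
    return out

# Toy example: two positions, tags N/V, gold sequence ["N", "V"].
scores = [{"N": 2.0, "V": 1.5}, {"N": 0.4, "V": 1.0}]
print(cost_augmented_decode(scores, ["N", "V"]))
```

In this toy example the augmentation flips both positions away from the gold tags, showing how the added cost surfaces near-miss errors that plain decoding would not.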

Taught by

Graham Neubig
