Neural Nets for NLP - Structured Prediction Basics

Graham Neubig via YouTube

Classroom Contents

  1. Intro
  2. A Prediction Problem
  3. Types of Prediction
  4. Why Call it "Structured" Prediction?
  5. Many Varieties of Structured Prediction!
  6. Sequence Labeling as
  7. Sequence Labeling w
  8. Why Model Interactions in Output? Consistency is important!
  9. A Tagger Considering Output Structure
  10. Training Structured Models
  11. Local Normalization and
  12. The Structured Perceptron Algorithm: an extremely simple way of training (non-probabilistic) global models (see the first sketch after this list)
  13. Structured Perceptron Loss
  14. Contrasting Perceptron and Global Normalization: globally normalized probabilistic model
  15. Structured Training and Pre-training
  16. Cost-Augmented Decoding for Hamming Loss: Hamming loss is decomposable over each word, so add a cost to the score of each incorrect choice during search (see the second sketch after this list)
  17. What's Wrong w/ Structured Hinge Loss?
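
The structured perceptron in item 12 is easy to sketch in code. The snippet below is a minimal illustration, not the lecture's own implementation: `feats` (a feature extractor over an input/output pair) and `decode` (an argmax search such as Viterbi) are hypothetical helpers supplied by the caller.

```python
def perceptron_update(weights, x, gold_y, feats, decode):
    """One structured perceptron step: decode with the current weights,
    then reward the gold structure and penalize the predicted one."""
    pred_y = decode(weights, x)               # highest-scoring structure under current weights
    if pred_y != gold_y:                      # update only when the search makes an error
        for f, v in feats(x, gold_y).items():
            weights[f] += v                   # boost features of the correct output
        for f, v in feats(x, pred_y).items():
            weights[f] -= v                   # demote features of the predicted output
    return weights

# Typical use: weights = collections.defaultdict(float), then loop over
# (x, y) pairs calling perceptron_update(weights, x, y, feats, decode).
```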
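
Cost-augmented decoding for Hamming loss (item 16) is also simple to illustrate: because Hamming loss decomposes over positions, adding a constant cost to the local score of every tag that disagrees with the gold tag and then running ordinary Viterbi search finds the candidate that is both high-scoring and high-loss. The sketch below assumes dense NumPy score tables (`emit` for per-position tag scores, `trans` for transition scores); the names and shapes are illustrative, not taken from the video.

```python
import numpy as np

def cost_augmented_viterbi(emit, trans, gold, cost=1.0):
    """Viterbi search over Hamming-cost-augmented scores.
    emit: [T, K] local tag scores, trans: [K, K] transition scores,
    gold: length-T sequence of gold tag ids."""
    T, K = emit.shape
    aug = emit.astype(float)
    for t in range(T):
        aug[t] += cost                     # charge the Hamming cost to every tag...
        aug[t, gold[t]] -= cost            # ...then take it back off the gold tag
    best = np.zeros((T, K))
    back = np.zeros((T, K), dtype=int)
    best[0] = aug[0]
    for t in range(1, T):
        scores = best[t - 1][:, None] + trans + aug[t][None, :]
        back[t] = scores.argmax(axis=0)    # best previous tag for each current tag
        best[t] = scores.max(axis=0)
    y = [int(best[-1].argmax())]           # backtrace from the best final tag
    for t in range(T - 1, 0, -1):
        y.append(int(back[t, y[-1]]))
    return y[::-1]                         # highest cost-augmented-scoring tag sequence
```

In structured hinge loss training (items 13 and 17), this cost-augmented argmax replaces the plain argmax so that outputs which are more wrong must be beaten by a larger margin.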
