
YouTube

Neural Nets for NLP 2017 - Parsing With Dynamic Programs

Graham Neubig via YouTube

Overview

Explore parsing with dynamic programs in this lecture from CMU's Neural Networks for NLP course. Delve into graph-based parsing, minimum spanning tree parsing, and structured training techniques. Learn about dynamic programming methods for phrase structure parsing and reranking approaches. Examine algorithms such as Chu-Liu-Edmonds and Eisner's, and understand the transition from traditional to neural models. Investigate global probabilistic training, the CKY and Viterbi algorithms, and conditional random fields (CRFs) for parsing. Gain insights into neural CRFs, structured inference, and recursive neural networks for parsing tasks.
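Among the topics listed above, the CKY algorithm is the core dynamic program for phrase-structure parsing. As a rough illustration of the idea covered in the lecture, here is a minimal CKY recognizer over a tiny hand-written grammar in Chomsky normal form; the grammar, symbols, and function name are illustrative assumptions, not taken from the lecture itself:

```python
def cky_recognize(words, unary, binary, start="S"):
    """CKY recognition for a grammar in Chomsky normal form.

    chart[i][j] holds the set of nonterminals that can span words[i:j].
    Runs in O(n^3 * |grammar|) time, the classic CKY dynamic program.
    """
    n = len(words)
    chart = [[set() for _ in range(n + 1)] for _ in range(n + 1)]
    # Base case: fill length-1 spans from the lexical (unary) rules.
    for i, w in enumerate(words):
        chart[i][i + 1] = set(unary.get(w, set()))
    # Recursive case: combine adjacent spans with binary rules.
    for span in range(2, n + 1):           # span length
        for i in range(n - span + 1):      # span start
            j = i + span
            for k in range(i + 1, j):      # split point
                for left in chart[i][k]:
                    for right in chart[k][j]:
                        chart[i][j] |= binary.get((left, right), set())
    return start in chart[0][n]

# Toy CNF grammar (illustrative only).
unary = {"the": {"Det"}, "dog": {"N"}, "barks": {"VP"}}
binary = {("Det", "N"): {"NP"}, ("NP", "VP"): {"S"}}

print(cky_recognize(["the", "dog", "barks"], unary, binary))  # True
```

A neural graph-based parser replaces the set union over grammar rules with learned scores on edges or spans, but the chart structure and the cubic-time traversal stay the same.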

Syllabus

Introduction
Linguistic Structure
Dynamic Programming Based Models
Minimum Spanning Tree
Graph Based vs Transition Based
Chu-Liu-Edmonds Algorithm
Eisner's Algorithm
Quiz
Before Neural Nets
Higher Order Dependency Parsing
Neural Models
Motivation
Model
Example
Global probabilistic training
Code example
Algorithms
Phrase Structures
Parsing vs Tagging
Hyper Graph Edges
Scoring Edges
CKY Algorithm
Viterbi Algorithm
Over Graphs
CRF
CRF Example
CRF Over Trees
Neural CRF
Inference
Parsing
Structured Inference
Recursive Neural Networks
Reranking
Reranking Results
Next Time

Taught by

Graham Neubig

