Neural Nets for NLP 2017 - Transition-Based Dependency Parsing

Graham Neubig via YouTube

Overview

Explore transition-based dependency parsing in this comprehensive lecture from CMU's Neural Networks for NLP course. Delve into the fundamentals of transition-based parsing, including shift-reduce parsing with feed-forward networks and stack LSTMs. Learn about a simple alternative approach using linearized trees. Gain insights into various parsing techniques, feature extraction methods, and the importance of tree structures in natural language processing. Examine practical code examples and follow along with detailed slides to reinforce your understanding of these advanced NLP concepts.
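The shift-reduce parsing covered in the lecture can be sketched as follows. This is a minimal, hypothetical illustration of the arc-standard transition system (SHIFT, LEFT-ARC, RIGHT-ARC), with a scripted action sequence standing in for the neural classifier the course builds; the sentence and action names are assumptions for demonstration only.

```python
def arc_standard_parse(words, oracle):
    """Parse a sentence using arc-standard SHIFT / LEFT-ARC / RIGHT-ARC transitions."""
    stack = []                         # indices of partially processed words
    buffer = list(range(len(words)))   # indices of words not yet read
    arcs = []                          # collected (head, dependent) pairs
    while buffer or len(stack) > 1:
        action = oracle(stack, buffer)
        if action == "SHIFT":
            # Move the next word from the buffer onto the stack
            stack.append(buffer.pop(0))
        elif action == "LEFT-ARC":
            # Top of stack becomes head of the second item, which is popped
            dep = stack.pop(-2)
            arcs.append((stack[-1], dep))
        elif action == "RIGHT-ARC":
            # Second item becomes head of the top item, which is popped
            dep = stack.pop()
            arcs.append((stack[-1], dep))
    return arcs

# Toy oracle: a fixed action sequence (a trained classifier would predict these)
actions = iter(["SHIFT", "SHIFT", "LEFT-ARC", "SHIFT", "RIGHT-ARC"])
arcs = arc_standard_parse(["I", "saw", "her"], lambda s, b: next(actions))
# arcs now holds (head, dependent) index pairs: "saw" heads both "I" and "her"
```

In the lecture, the oracle is replaced by a classifier (e.g. a feed-forward network or stack LSTM) that scores the next transition from features of the current stack and buffer configuration.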

Syllabus

Intro
Two Types of Linguistic Structure
Arc Standard Shift-Reduce Parsing (Yamada & Matsumoto 2003, Nivre 2003)
Shift-Reduce Example
Classification for Shift-reduce
Making Classification Decisions
What Features to Extract?
Non-linear Function: Cube Function
Why Tree Structure?
Tree-structured LSTM (Tai et al. 2015)
Encoding Parsing Configurations w/ RNNs
A Simple Approximation: Linearized Trees (Vinyals et al. 2015)
Recursive Neural Networks (Socher et al. 2011)
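The "linearized trees" idea from the syllabus (Vinyals et al. 2015) treats parsing as sequence prediction by flattening a tree into a string of bracket tokens. A hedged sketch, assuming a simple nested `(label, children)` tree representation chosen here for illustration:

```python
def linearize(tree):
    """Flatten a nested (label, children) tree into a flat list of bracket tokens."""
    if isinstance(tree, str):
        return [tree]                  # leaf: a word token
    label, children = tree
    out = ["(" + label]                # opening bracket tagged with the label
    for child in children:
        out.extend(linearize(child))   # recurse into each subtree
    out.append(")" + label)            # matching closing bracket
    return out

# Hypothetical example tree for "I saw her"
tree = ("S", [("NP", ["I"]), ("VP", ["saw", ("NP", ["her"])])])
tokens = linearize(tree)
# tokens: ['(S', '(NP', 'I', ')NP', '(VP', 'saw', '(NP', 'her', ')NP', ')VP', ')S']
```

Once trees are linearized this way, an off-the-shelf sequence-to-sequence model can predict the bracket sequence directly, avoiding an explicit transition system.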

Taught by

Graham Neubig
