Continuous State Machines and Grammars for Linguistic Structure Prediction

Simons Institute via YouTube

Overview

Explore continuous state machines and grammars for linguistic structure prediction in this lecture from the Simons Institute. Delve into dependency parsing, recurrent neural networks, and stack LSTM parsers. Learn about global and greedy parsing paradigms, token and tree representations, and multilingual parsing approaches. Examine variations like adding semantics to parsers and using RNN grammars. Investigate phrase-structure trees and techniques for tree and string generation. Gain insights into cutting-edge research in natural language processing and computational linguistics through practical examples and experimental results presented by Noah Smith from the University of Washington.
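To make the "greedy parsing with a stack" idea concrete, here is a minimal Python sketch of greedy transition-based dependency parsing in the arc-standard style the lecture builds on. Everything here is illustrative: the next_action policy is a stand-in for the stack LSTM classifier discussed in the talk, and greedy_parse and toy_policy are hypothetical names, not code from the lecture.

SHIFT, LEFT_ARC, RIGHT_ARC = "SHIFT", "LEFT_ARC", "RIGHT_ARC"

def greedy_parse(words, next_action):
    """Greedily parse `words` into (head, dependent) arcs.

    `next_action(stack, buffer)` picks the next transition; in the
    lecture this decision is made by a stack LSTM over the parser state.
    """
    stack = []                        # indices of partially built subtrees
    buffer = list(range(len(words)))  # indices of unread tokens
    arcs = []                         # (head, dependent) pairs

    while buffer or len(stack) > 1:
        action = next_action(stack, buffer)
        if action == SHIFT and buffer:
            stack.append(buffer.pop(0))  # read the next token
        elif action == LEFT_ARC and len(stack) >= 2:
            dep = stack.pop(-2)          # stack top heads the item below it
            arcs.append((stack[-1], dep))
        elif action == RIGHT_ARC and len(stack) >= 2:
            dep = stack.pop()            # item below the top heads the top
            arcs.append((stack[-1], dep))
        else:
            break                        # no valid action: stop
    return arcs

# Toy policy standing in for a trained model: shift every token,
# then attach right-to-left.
def toy_policy(stack, buffer):
    return SHIFT if buffer else RIGHT_ARC

print(greedy_parse(["the", "dog", "barks"], toy_policy))
# [(1, 2), (0, 1)]

The "global" or "graph-based" paradigm mentioned in the syllabus instead scores whole trees and searches for the best one; the greedy transition-based approach sketched above trades that exactness for linear-time parsing.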

Syllabus

Intro
Linguistic Structure Example: Dependencies
"Global" or "Graph-Based" Paradigm
Greedy Parsing with a Stack
Recurrent Neural Network
Stack RNN
Stack LSTM Parser
Token and Tree Representations
Learning
Results (Labeled Attachment Score)
Variations
Variation: Many Languages, One Parser (Ammar et al., TACL 2016)
Stack LSTM MALOPA
Tiny Target Treebank: Results
Zero Target Treebank: Results
Variation: Add Semantics (Swayamdipta et al., CoNLL 2016)
Linguistic Structure Example: Semantic Dependencies
Variation: RNN Grammars (Dyer et al., NAACL 2016, Kuncoro et al., EACL 2017)
Another Linguistic Structure Example: Phrase-Structure Tree
Better Dependency Parsers?
Tree & String Generation with a Stack (sketched after this syllabus)
Additional Details
Conclusions
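The "Tree & String Generation with a Stack" item refers to recurrent neural network grammars (Dyer et al., NAACL 2016), which generate a sentence and its phrase-structure tree together through NT(X), GEN(w), and REDUCE actions. Below is a minimal, model-free sketch of how such an action sequence deterministically builds the tree; in an actual RNNG, an RNN reads the stack to score each next action. The execute function and the tree encoding (lists for open constituents, tuples for finished ones) are this sketch's own conventions, not the paper's.

def execute(actions):
    """Build a phrase-structure tree from NT/GEN/REDUCE actions."""
    stack = []
    for act, arg in actions:
        if act == "NT":        # open a constituent: push "(X"
            stack.append([arg])
        elif act == "GEN":     # generate the next terminal word
            stack.append(arg)
        elif act == "REDUCE":  # close the most recently opened constituent
            children = []
            while not isinstance(stack[-1], list):
                children.append(stack.pop())
            open_nt = stack.pop()
            stack.append(tuple(open_nt + children[::-1]))
    return stack[0]

actions = [
    ("NT", "S"),
    ("NT", "NP"), ("GEN", "the"), ("GEN", "dog"), ("REDUCE", None),
    ("NT", "VP"), ("GEN", "barks"), ("REDUCE", None),
    ("REDUCE", None),
]
print(execute(actions))  # ('S', ('NP', 'the', 'dog'), ('VP', 'barks'))
print([a for act, a in actions if act == "GEN"])  # the generated string

Because one action sequence yields both the tree and the word string, an RNN grammar defines a joint distribution over sentences and their parses, which is what lets it serve as both a parser and a language model.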

Taught by

Simons Institute
