
Neural Nets for NLP 2017 - Neural Semantic Parsing

Graham Neubig via YouTube

Overview

Explore neural semantic parsing in this lecture from CMU's Neural Networks for NLP course. Survey meaning representations, from special-purpose query representations to first-order logic and Abstract Meaning Representation. Compare a first attempt with sequence-to-sequence models (Jia and Liang 2016) against tree-based parsing models that generate top-down with a hierarchical sequence-to-sequence decoder (Dong and Lapata 2016). Examine syntax-driven semantic parsing, CCG and CCG parsing, and parsing to graph structures, and conclude with semantic role labeling and neural models for it. Access accompanying slides and code examples to reinforce understanding of key concepts.
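To make the query tasks covered in the lecture concrete, here is a toy, hedged sketch of executable semantic parsing in the GeoQuery style: a question is mapped to a logical-form string and executed against a small database. All names and the database are invented for illustration, and the "parser" is a crude slot-filler standing in for the learned sequence-to-sequence or tree-based models the lecture discusses.

```python
# Toy illustration of executable semantic parsing (hypothetical example,
# not code from the lecture): question -> logical form -> database answer.

# Tiny GeoQuery-like database: state -> list of bordering states.
BORDERS = {
    "texas": ["oklahoma", "arkansas", "louisiana", "new mexico"],
    "ohio": ["michigan", "indiana", "kentucky", "pennsylvania", "west virginia"],
}

def parse(question: str) -> str:
    """Map a question to a logical-form string.

    A real system would learn this mapping with a neural model; here a
    crude last-word heuristic stands in for the parser.
    """
    state = question.rstrip("?").split()[-1]
    return f"answer(x, borders(x, {state}))"

def execute(logical_form: str) -> list:
    """Execute the logical form against the toy database."""
    state = logical_form.rstrip(")").split(", ")[-1]  # last argument
    return BORDERS.get(state, [])

lf = parse("what states border texas?")
print(lf)           # answer(x, borders(x, texas))
print(execute(lf))  # ['oklahoma', 'arkansas', 'louisiana', 'new mexico']
```

The point of the sketch is the task format the lecture assumes: utterances paired with executable meaning representations, so that model accuracy can be measured by the answers the logical forms return.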

Syllabus

Intro
Tree Structures of Syntax
Representations of Semantics
Meaning Representations
Example Special-purpose Representations
Example Query Tasks
A First Attempt: Sequence-to-sequence Models (Jia and Liang 2016)
A Better Attempt: Tree-based Parsing Models • Generate from top-down using hierarchical sequence-to-sequence model (Dong and Lapata 2016)
Meaning Representation Desiderata (Jurafsky and Martin 17.1)
First-order Logic
Abstract Meaning Representation (Banarescu et al. 2013)
Syntax-driven Semantic Parsing
CCG and CCG Parsing
Parsing to Graph Structures
Semantic Role Labeling (Gildea and Jurafsky 2002)
Neural Models for Semantic Role Labeling
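The "First-order Logic" item in the syllabus treats logic as a meaning representation whose truth can be checked against a model. Below is a minimal, hedged sketch of that idea (formulas as nested tuples, a hand-picked finite domain; none of it comes from the course materials): an evaluator for conjunction and existential quantification over a tiny world.

```python
# Minimal first-order logic evaluator (illustrative only): formulas are
# nested tuples, evaluated against a finite model.

DOMAIN = {"texas", "oklahoma", "mexico"}
RELATIONS = {
    "state": {("texas",), ("oklahoma",)},
    "borders": {("texas", "oklahoma"), ("texas", "mexico")},
}

def evaluate(formula, env=None):
    """Return the truth value of a formula under a variable binding env."""
    env = env or {}
    op = formula[0]
    if op == "and":
        return evaluate(formula[1], env) and evaluate(formula[2], env)
    if op == "exists":                      # ("exists", var, body)
        _, var, body = formula
        return any(evaluate(body, {**env, var: d}) for d in DOMAIN)
    # Atomic predicate, e.g. ("borders", "x", "mexico"): substitute any
    # bound variables, then look the tuple up in the relation.
    pred, *args = formula
    tup = tuple(env.get(a, a) for a in args)
    return tup in RELATIONS[pred]

# "Some state borders Mexico": exists x. state(x) AND borders(x, mexico)
f = ("exists", "x", ("and", ("state", "x"), ("borders", "x", "mexico")))
print(evaluate(f))  # True
```

A semantic parser in this setting outputs such formulas; the evaluator is what makes the representation "executable" and hence checkable.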

Taught by

Graham Neubig

