
Neural Nets for NLP 2020 - Generating Trees Incrementally

Graham Neubig via YouTube

Overview

Learn about transition-based parsing, shift-reduce parsing with feed-forward networks, stack LSTMs, and semantic parsing in this comprehensive lecture from CMU's Neural Networks for NLP course. Explore the fundamentals of linguistic structure generation, including common types and semantic parsing tasks. Dive into shift-reduce examples, classification techniques, feature extraction, and the importance of tree structures in natural language processing. Examine recursive neural networks and their applications. Understand the core research questions for improving models, focusing on adding inductive biases to better capture program structures. Gain insights into supervised learning of semantic parsers and the key considerations in designing decoders that align with program structures.
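To make the transition-based idea concrete, here is a minimal sketch of arc-standard shift-reduce dependency parsing, where a classifier (in the course, a feed-forward network or stack LSTM) would choose each action; the sentence, action names, and oracle sequence below are illustrative assumptions, not examples from the lecture.

```python
# Minimal arc-standard shift-reduce parser (sketch).
# In a neural parser, the `actions` sequence would be predicted step by step
# from features of the stack and buffer; here it is given as an oracle.

def parse(words, actions):
    """Apply SHIFT / REDUCE-L / REDUCE-R actions to build dependency arcs.

    Arcs are (head, dependent) pairs over word indices.
    """
    stack, buffer, arcs = [], list(range(len(words))), []
    for action in actions:
        if action == "SHIFT":        # move the next buffer word onto the stack
            stack.append(buffer.pop(0))
        elif action == "REDUCE-L":   # stack top is the head of the word below it
            dep = stack.pop(-2)
            arcs.append((stack[-1], dep))
        elif action == "REDUCE-R":   # word below the top is the head of the top
            dep = stack.pop()
            arcs.append((stack[-1], dep))
    return arcs

# "I saw her" with "saw" as the root:
# shift I, shift saw, attach I <- saw, shift her, attach her <- saw.
words = ["I", "saw", "her"]
actions = ["SHIFT", "SHIFT", "REDUCE-L", "SHIFT", "REDUCE-R"]
print(parse(words, actions))  # [(1, 0), (1, 2)]
```

Each intermediate stack/buffer configuration is exactly what the lecture's classifiers extract features from when deciding the next action.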

Syllabus

Intro
Two Common Types of Linguistic Structure
Semantic Parsing: Another Representative Tree Generation Task
Shift Reduce Example
Classification for Shift-reduce
Making Classification Decisions
What Features to Extract?
Why Tree Structure?
Recursive Neural Networks (Socher et al. 2011)
Why Linguistic Structure?
Clarification about Meaning Representations (MRs): machine-executable MRs (our focus today) are executable programs that accomplish a task; MRs for semantic annotation capture the semantics of natural language sentences
Core Research Question for Better Models: how to add inductive biases to networks to better capture the structure of programs?
Summary: Supervised Learning of Semantic Parsers (key question: designing decoders that follow the structure of programs)
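The "Recursive Neural Networks (Socher et al. 2011)" item above can be sketched in a few lines: each tree node's vector is a learned composition of its children's vectors. The dimensionality, random weights, and example tree below are placeholder assumptions for illustration, not parameters from the lecture.

```python
# Sketch of a recursive neural network in the spirit of Socher et al. (2011):
# a parent vector is tanh(W [left; right] + b) over its children's vectors.
import numpy as np

rng = np.random.default_rng(0)
D = 4                                        # vector dimensionality (assumed)
W = rng.standard_normal((D, 2 * D)) * 0.1    # shared composition matrix
b = np.zeros(D)

def compose(left, right):
    """Combine two child vectors into a parent vector."""
    return np.tanh(W @ np.concatenate([left, right]) + b)

def encode(tree, embeddings):
    """Recursively encode a binary tree; a tree is a word (str)
    or a (left, right) pair of subtrees."""
    if isinstance(tree, str):
        return embeddings[tree]
    left, right = tree
    return compose(encode(left, embeddings), encode(right, embeddings))

# Encode the tree (the (cat sat)) with random word embeddings.
embeddings = {w: rng.standard_normal(D) * 0.1 for w in ["the", "cat", "sat"]}
vec = encode(("the", ("cat", "sat")), embeddings)
print(vec.shape)  # (4,)
```

This is why tree structure matters in the lecture: the same composition function is reused at every node, so the network's computation mirrors the linguistic structure of the sentence.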

Taught by

Graham Neubig
