

Neural Nets for NLP 2020 - Machine Reading with Neural Nets

Graham Neubig via YouTube

Overview

Explore machine reading with neural networks in this comprehensive lecture from CMU's Neural Networks for NLP course. Dive into various machine reading datasets, methods for encoding context and multi-hop reasoning, and important caveats about dataset biases. Learn about multiple-choice questions, span selection, and cloze tasks as question answering formats. Examine case studies of popular datasets, including bAbI, CNN/Daily Mail, and Natural Questions. Discover basic models for document attention, such as the Attentive Reader and Bidirectional Attention Flow. Investigate multi-step reasoning techniques, including memory networks and question decomposition. Gain insights into combining traditional computational semantics with neural approaches for tasks like numerical calculation and solving word problems.
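The "document attention" idea mentioned above can be sketched in a few lines: score each context token against the question, normalize the scores, and take a weighted sum. This is a minimal illustration using plain dot-product scoring (the original Attentive Reader uses a learned bilinear scorer, and all vectors here are random stand-ins for learned encodings):

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over the last axis."""
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def attentive_read(context, query):
    """Attentive-Reader-style document attention (sketch).

    context: (T, d) array of token encodings for the passage
    query:   (d,)  vector encoding the question
    Returns an attention-weighted summary of the context.
    """
    scores = context @ query      # (T,) relevance of each token to the question
    weights = softmax(scores)     # normalize scores into a distribution
    return weights @ context      # (d,) weighted sum of token encodings

# Toy example: a 5-token passage with 8-dimensional encodings.
rng = np.random.default_rng(0)
ctx = rng.standard_normal((5, 8))
q = rng.standard_normal(8)
summary = attentive_read(ctx, q)
print(summary.shape)  # (8,)
```

In a full model this summary vector would feed a classifier (e.g. over answer candidates or span positions); variants in the lecture, such as attention-over-attention and BiDAF, differ mainly in how the scores are computed.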

Syllabus

Intro
What is Machine Reading?
Machine Reading Question Answering Formats
Multiple-choice Question Tasks
Span Selection
Cloze Questions
What is Necessary for Machine Reading?
All Datasets Have Their Biases
A Case Study: bAbI (Weston et al. 2014)
An Examination of CNN/Daily Mail (Chen et al. 2015)
Adversarial Examples in Machine Reading (Jia and Liang 2017)
Adversarial Creation of New Datasets? (Zellers et al. 2018)
Natural Questions (Kwiatkowski et al. 2019)
A Basic Model for Document Attention
A First Try: Attentive Reader
Attention-over-attention
Bidirectional Attention Flow
Word Classification vs. Span Classification
Dynamic Span Decoder (Xiong et al. 2017)
Multi-step Reasoning Datasets
Softened and Multi-layer Memory Networks (Sukhbaatar et al. 2015): use standard softmax attention and multiple layers
When to Stop Reasoning?
Coarse-to-fine Question Answering (Choi et al. 2017)
Retrieval + Language Model
Explicit Question Decomposition for Multi-step Reasoning
Question Answering with Context (Choi et al. 2018, Reddy et al. 2018)
An Aside: Traditional Computational Semantics: reasoning is something that traditional semantic representations are really good at!
Numerical Calculation
Machine Reading with Symbolic Operations: can we explicitly incorporate numerical reasoning in machine reading?
Solving Word Problems with Symbolic Reasoning: combine semantic parsing (with explicit functions) and machine reading

Taught by

Graham Neubig
