Class Central Classrooms beta
YouTube videos curated by Class Central.
Classroom Contents
Neural Nets for NLP 2020 - Machine Reading with Neural Nets
- 1 Intro
- 2 What is Machine Reading?
- 3 Machine Reading Question Answering Formats
- 4 Multiple-choice Question Tasks
- 5 Span Selection
- 6 Cloze Questions
- 7 What is Necessary for Machine Reading?
- 8 All Datasets Have Their Biases
- 9 A Case Study: bAbI (Weston et al. 2014)
- 10 An Examination of CNN/Daily Mail (Chen et al. 2015)
- 11 Adversarial Examples in Machine Reading (Jia and Liang 2017)
- 12 Adversarial Creation of New Datasets? (Zellers et al. 2018)
- 13 Natural Questions (Kwiatkowski et al. 2019)
- 14 A Basic Model for Document Attention
- 15 A First Try: Attentive Reader
- 16 Attention-over-attention
- 17 Bidirectional Attention Flow
- 18 Word Classification vs. Span Classification
- 19 Dynamic Span Decoder (Xiong et al. 2017)
- 20 Multi-step Reasoning Datasets
- 21 Softened and Multi-layer Memory Networks (Sukhbaatar et al. 2015) • Use standard softmax attention and multiple layers
- 22 When to Stop Reasoning?
- 23 Coarse-to-fine Question Answering (Choi et al. 2017)
- 24 Retrieval + Language Model
- 25 Explicit Question Decomposition for Multi-step Reasoning
- 26 Question Answering with Context (Choi et al. 2018, Reddy et al. 2018)
- 27 An Aside: Traditional Computational Semantics • Reasoning is something that traditional semantic representations are really good at!
- 28 Numerical Calculation
- 29 Machine Reading with Symbolic Operations • Can we explicitly incorporate numerical reasoning in machine reading?
- 30 Solving Word Problems w/ Symbolic Reasoning • Idea: combine semantic parsing (with explicit functions) and machine reading
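Several of the sections above (5 Span Selection, 18 Word Classification vs. Span Classification) revolve around the same idea: score every document token as a possible answer-span start and end, then pick the best valid pair. A minimal sketch of that setup, using toy dot-product scoring in place of the learned attention layers real models (e.g. BiDAF) use; all names and vectors here are illustrative, not from the lecture:

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    z = sum(es)
    return [e / z for e in es]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def span_probs(doc_vecs, q_vec):
    # Score each document token against the question vector, then
    # softmax over positions: one distribution for the span start,
    # one for the end (here derived from scaled scores for brevity;
    # real models use separate learned parameters for start and end).
    scores = [dot(t, q_vec) for t in doc_vecs]
    return softmax(scores), softmax([s * 0.5 for s in scores])

def best_span(start, end, max_len=5):
    # Exhaustively pick (i, j) maximizing P(start=i) * P(end=j)
    # subject to i <= j < i + max_len.
    best, best_p = (0, 0), -1.0
    for i in range(len(start)):
        for j in range(i, min(i + max_len, len(end))):
            p = start[i] * end[j]
            if p > best_p:
                best_p, best = p, (i, j)
    return best

# Toy example: 4 token vectors and a question vector.
doc = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.5, 0.5]]
q = [1.0, 1.0]
start, end = span_probs(doc, q)
span = best_span(start, end)  # token 2 scores highest as both start and end
```

The start/end factorization is what makes span selection tractable: instead of scoring O(T²) spans directly, the model scores 2T positions and combines them, with the `i <= j` constraint enforced at decoding time (the Dynamic Span Decoder in section 19 iterates this decoding step).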