Classroom Contents
Neural Nets for NLP 2019 - Document Level Models
- 1 Intro
- 2 Some Connections to Tasks over Documents
- 3 Document Level Language Modeling
- 4 What Context to Incorporate?
- 5 How to Evaluate Document Coherence Models?
- 6 Mention (Noun Phrase) Detection
- 7 Components of a Coreference Model (structured like a traditional machine learning model; see the mention-pair sketch after this list)
- 8 Coreference Models: Instances
- 9 Mention Pair Models
- 10 Entity Models
- 11 Advantages of Neural Network Models for Coreference
- 12 Coreference Resolution w/ Entity-Level Distributed Representations
- 13 End-to-End Neural Coreference (Span Model)
- 14 End-to-End Neural Coreference (Coreference Model)
- 15 Using Coreference in Neural Models
- 16 Document Problems: Discourse Parsing
- 17 Shift-Reduce Parsing: Discourse Structure Parsing w/ Distributed Representations (Ji and Eisenstein 2014), a shift-reduce parser with features from the top two stack elements and the front queue element (see the parsing sketch after this list)
- 18 Discourse Parsing w/ Attention-based Hierarchical Neural Networks
- 19 Uses of Discourse Structure in Neural Models
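
The coreference chapters (7-9) present a pipeline with the same components as a traditional machine learning model: candidate instances, feature extraction, a scoring function, and clustering into entities. As a rough illustration only (not taken from the lecture; the features and the linear scorer are hypothetical stand-ins for learned ones), a mention-pair resolver might look like this:

```python
# Hypothetical mention-pair coreference sketch: build (antecedent, mention)
# instances, extract features, score each pair, and link every mention to its
# best-scoring earlier mention. Not the lecture's actual model.

from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class Mention:
    start: int   # token index where the mention span starts
    end: int     # token index where the mention span ends (exclusive)
    head: str    # head word of the noun phrase


def pair_features(antecedent: Mention, mention: Mention) -> List[float]:
    """Toy hand-crafted features for one (antecedent, mention) instance."""
    return [
        float(mention.start - antecedent.end),                    # span distance
        float(antecedent.head.lower() == mention.head.lower()),   # head-word match
        float(antecedent.end - antecedent.start),                 # antecedent length
    ]


def score_pair(features: List[float], weights: List[float]) -> float:
    """A traditional linear scorer; a neural model would replace this."""
    return sum(w * f for w, f in zip(weights, features))


def resolve(mentions: List[Mention], weights: List[float],
            threshold: float = 0.0) -> List[Tuple[int, int]]:
    """Link each mention to its highest-scoring earlier mention, or to nothing."""
    links = []
    for j, mention in enumerate(mentions):
        best_i, best_score = -1, threshold
        for i in range(j):   # candidate antecedents precede the mention
            s = score_pair(pair_features(mentions[i], mention), weights)
            if s > best_score:
                best_i, best_score = i, s
        if best_i >= 0:
            links.append((best_i, j))
    return links
```

The neural models in chapters 10-12 keep this overall structure but replace hand-built features and the linear scorer with learned distributed representations of mentions and entities.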
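
Chapter 17 describes the Ji and Eisenstein (2014) approach as a shift-reduce parser whose classifier sees features of the top two stack elements and the front element of the queue of elementary discourse units (EDUs). Below is a minimal sketch under that description, not their actual implementation: `choose_action` is a hypothetical stand-in for the trained classifier, and the resulting tree carries no relation labels.

```python
# Minimal shift-reduce discourse-parsing sketch (illustrative only): at each
# step the classifier inspects the top two stack elements and the front of the
# EDU queue, then either SHIFTs the next EDU or REDUCEs the top two subtrees.

from collections import deque
from typing import Callable, List, Optional, Union

Tree = Union[str, tuple]   # an EDU string, or a (left_subtree, right_subtree) pair
Policy = Callable[[Optional[Tree], Optional[Tree], Optional[str]], str]


def parse(edus: List[str], choose_action: Policy) -> Tree:
    """Greedy shift-reduce loop; `choose_action` stands in for the trained classifier."""
    queue = deque(edus)
    stack: List[Tree] = []
    while queue or len(stack) > 1:
        s1 = stack[-1] if stack else None            # top stack element
        s2 = stack[-2] if len(stack) >= 2 else None  # second stack element
        q1 = queue[0] if queue else None             # front queue element
        action = choose_action(s2, s1, q1)           # classifier sees features of s2, s1, q1
        can_reduce = len(stack) >= 2
        if (action == "SHIFT" or not can_reduce) and queue:
            stack.append(queue.popleft())            # SHIFT: push the next EDU
        else:
            right = stack.pop()                      # REDUCE: attach the two top subtrees
            left = stack.pop()
            stack.append((left, right))
    return stack[0]


# Toy usage with a trivial policy: shift while EDUs remain, then reduce.
if __name__ == "__main__":
    policy = lambda s2, s1, q1: "SHIFT" if q1 is not None else "REDUCE"
    print(parse(["EDU1", "EDU2", "EDU3"], policy))
```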