Class Central Classrooms beta
YouTube videos curated by Class Central.
Classroom Contents
Neural Nets for NLP 2021 - Document-Level Models
- 1 Some NLP Tasks we've Handled
- 2 Some Connections to Tasks over Documents
- 3 Document Level Language Modeling
- 4 Remember: Modeling using Recurrent Networks
- 5 Simple: Infinitely Pass State
- 6 Separate Encoding for Coarse-grained Document Context
- 7 Self-attention/Transformers Across Sentences
- 8 Transformer-XL: Truncated BPTT+Transformer
- 9 Adaptive Span Transformers
- 10 Reformer: Efficient Adaptively Sparse Attention
- 11 How to Evaluate Document-level Models?
- 12 Document Problems: Entity Coreference
- 13 Mention (Noun Phrase) Detection
- 14 Components of a Coreference Model
- 15 Coreference Models: Instances
- 16 Mention Pair Models
- 17 Entity Models: Entity-Mention Models
- 18 Advantages of Neural Network Models for Coreference
- 19 End-to-End Neural Coreference (Span Model)
- 20 End-to-End Neural Coreference (Coreference Model)
- 21 Using Coreference in Neural Models
- 22 Discourse Parsing w/ Attention-based Hierarchical Neural Networks
- 23 Uses of Discourse Structure in Neural Models