Overview
Syllabus
Some NLP Tasks we've Handled
Some Connections to Tasks over Documents
Document Level Language Modeling
Remember: Modeling using Recurrent Networks
Simple: Infinitely Pass State
Separate Encoding for Coarse-grained Document Context
Self-attention/Transformers Across Sentences
Transformer-XL: Truncated BPTT + Transformer
Adaptive Span Transformers
Reformer: Efficient Adaptively Sparse Attention
How to Evaluate Document-level Models?
Document Problems: Entity Coreference
Mention (Noun Phrase) Detection
Components of a Coreference Model
Coreference Models: Instances
Mention Pair Models
Entity Models: Entity-Mention Models
Advantages of Neural Network Models for Coreference
End-to-End Neural Coreference (Span Model)
End-to-End Neural Coreference (Coreference Model)
Using Coreference in Neural Models
Discourse Parsing w/ Attention-based Hierarchical Neural Networks
Uses of Discourse Structure in Neural Models
Taught by
Graham Neubig