Overview
Explore advanced natural language processing techniques for document-level modeling in this comprehensive lecture from CMU's Advanced NLP course. Delve into extracting features from long sequences, coreference resolution, and discourse parsing. Learn about various encoding methods, including self-attention and Transformer models, as well as efficient approaches like sparse and adaptive span transformers. Discover techniques for entity coreference, including mention detection and pair models. Examine neural models for discourse parsing and gain insights into evaluating document-level language models. Access additional resources and materials through the provided class website to further enhance your understanding of these advanced NLP concepts.
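The overview mentions self-attention over long sequences and efficient variants such as sparse and adaptive-span transformers. As a rough illustration of the idea (not code from the lecture; the function names and identity Q/K/V projections are simplifications for brevity), full self-attention scores every token pair, while a local variant restricts each token to a fixed window of keys to cut the quadratic cost:

```python
import numpy as np

def self_attention(X):
    """Scaled dot-product self-attention over a token sequence.

    X: (seq_len, d_model) array of token embeddings. Identity
    projections stand in for the learned Q, K, V matrices.
    """
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)                    # all-pairs scores: O(n^2)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over keys
    return weights @ X                               # mix of all tokens

def local_attention(X, window=2):
    """Windowed attention: each token attends only to nearby tokens,
    the core trick behind sparse / adaptive-span transformers."""
    d, n = X.shape[-1], X.shape[0]
    out = np.zeros_like(X)
    for i in range(n):
        ctx = X[max(0, i - window):i + 1]            # local key window only
        s = ctx @ X[i] / np.sqrt(d)
        w = np.exp(s - s.max())
        w /= w.sum()
        out[i] = w @ ctx
    return out
```

Both functions return a (seq_len, d_model) array; the local variant trades global context for roughly linear cost in sequence length.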
Syllabus
Intro
Document-level Language Modeling
Recurrent Neural Networks
Encoding Methods
Self-Attention
Transformer-XL
Compressive Transformer
Sparse Transformer
Adaptive Span Transformer
Sparse Computations
Reformer
Low-Rank Approximation
Evaluation
Entity Coreference
Mention Detection
Components
Instances
Pair Models
Coreference
Coreference Model
Coreference Models
Discourse Parsing
Neural Models
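The coreference segments above cover mention detection and pair models. As a hedged sketch of the mention-pair idea only (every name here is illustrative; the lecture's actual models are learned neural scorers), a pair model scores whether a mention corefers with each earlier mention and links it to the best-scoring antecedent, or to none:

```python
import numpy as np

def pair_score(antecedent, mention, W, b):
    """Toy mention-pair scorer: concatenate the two mention vectors and
    their elementwise product, then apply one linear layer."""
    feats = np.concatenate([antecedent, mention, antecedent * mention])
    return float(feats @ W + b)

def best_antecedent(mentions, W, b):
    """For each mention, link to the highest-scoring earlier mention,
    or to None when no antecedent beats the null-link score of 0."""
    links = []
    for i, m in enumerate(mentions):
        scores = [pair_score(a, m, W, b) for a in mentions[:i]]
        if scores and max(scores) > 0:
            links.append(int(np.argmax(scores)))
        else:
            links.append(None)                 # starts a new entity cluster
    return links
```

Chaining these antecedent links together yields the entity clusters that coreference evaluation metrics compare against gold clusters.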
Taught by
Graham Neubig