

Neural Nets for NLP 2020 - Document Level Models

Graham Neubig via YouTube

Overview

Learn about document-level natural language processing in this lecture from CMU's Neural Networks for NLP course. Explore document-level language modeling techniques, including recurrent networks and self-attention mechanisms. Dive into entity coreference resolution, covering mention detection, mention pair models, and entity-level distributed representations. Examine discourse parsing using attention-based hierarchical neural networks. Gain insights into applying these advanced NLP concepts to tasks spanning multiple sentences and entire documents.
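To give a concrete sense of the mention-pair approach covered in the lecture, here is a minimal illustrative sketch: each mention is compared against every preceding mention, and linked to its best-scoring antecedent (or to none). The function names and the word-overlap scoring heuristic are assumptions for illustration only; the course presents neural scorers over learned span representations rather than this toy heuristic.

```python
# Minimal mention-pair coreference sketch (illustrative; the scoring
# heuristic and names are assumptions, not the lecture's actual model).

def pair_score(mention_a, mention_b):
    """Toy antecedent score: Jaccard word overlap between two mentions.
    A neural mention-pair model would replace this with a learned scorer."""
    a, b = set(mention_a.lower().split()), set(mention_b.lower().split())
    return len(a & b) / max(len(a | b), 1)

def resolve(mentions, threshold=0.3):
    """Link each mention to its highest-scoring antecedent, or None."""
    antecedents = []
    for i, mention in enumerate(mentions):
        best, best_score = None, threshold
        for j in range(i):  # only preceding mentions can be antecedents
            score = pair_score(mention, mentions[j])
            if score > best_score:
                best, best_score = j, score
        antecedents.append(best)
    return antecedents

mentions = ["President Obama", "the President", "Obama", "Congress"]
print(resolve(mentions))  # → [None, 0, 0, None]
```

Mentions 1 and 2 both link back to mention 0, forming one entity cluster, while "Congress" starts its own; entity-mention models, also covered in the lecture, extend this by scoring mentions against whole clusters instead of single antecedents.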

Syllabus

Some NLP Tasks we've Handled
Some Connections to Tasks over Documents
Document Level Language Modeling
Remember: Modeling using Recurrent Networks
Separate Encoding for Coarse-Grained Document Context (Mikolov & Zweig 2012)
What Context to Incorporate?
Self-attention/Transformers Across Sentences
Document Problems: Entity Coreference
Mention (Noun Phrase) Detection
Components of a Coreference Model
Coreference Models: Instances
Mention Pair Models
Entity Models: Entity-Mention Models
Advantages of Neural Network Models for Coreference
Coreference Resolution w/ Entity-Level Distributed Representations
Deep Reinforcement Learning for Mention-Ranking Coreference Models
End-to-End Neural Coreference (Span Model)
End-to-End Neural Coreference (Coreference Model)
Using Coreference in Neural Models
Discourse Parsing w/ Attention-Based Hierarchical Neural Networks
Uses of Discourse Structure

Taught by

Graham Neubig

