
YouTube

CMU Neural Nets for NLP 2018 - Document-Level Models

Graham Neubig via YouTube

Overview

Explore document-level models in natural language processing through this comprehensive lecture from Carnegie Mellon University's Neural Networks for NLP course. Delve into topics such as language modeling, long-term dependencies, topic modeling, and coreference resolution. Learn about entity mention models, entity-centric models, and complex features in coreference. Examine discourse parsing techniques, including shift-reduce parsers and recursive models. Understand the importance of coreference in language modeling and its applications in discourse analysis. Investigate document classification methods and their accuracy. Gain insights into advanced NLP concepts and techniques for processing and analyzing entire documents.
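As a rough illustration of one idea mentioned above, the following minimal PyTorch sketch shows a mention-pair coreference scorer that rates whether two mention embeddings refer to the same entity. This is not code from the lecture; the span dimension, hidden size, and pairwise features are assumptions made for the example.

# Illustrative sketch only: a minimal mention-pair coreference scorer.
# Mention span embeddings are assumed to be given by an upstream encoder.
import torch
import torch.nn as nn

class MentionPairScorer(nn.Module):
    def __init__(self, span_dim: int, hidden_dim: int = 128):
        super().__init__()
        # Score the pair from the two span vectors and their elementwise product.
        self.ffnn = nn.Sequential(
            nn.Linear(3 * span_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 1),
        )

    def forward(self, antecedent: torch.Tensor, anaphor: torch.Tensor) -> torch.Tensor:
        # antecedent, anaphor: (batch, span_dim) mention embeddings
        pair = torch.cat([antecedent, anaphor, antecedent * anaphor], dim=-1)
        return self.ffnn(pair).squeeze(-1)  # (batch,) coreference scores

# Toy usage: score two random mention pairs (higher score = more likely to corefer).
scorer = MentionPairScorer(span_dim=64)
antecedents = torch.randn(2, 64)
anaphors = torch.randn(2, 64)
print(scorer(antecedents, anaphors))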

Syllabus

Document-level Models
Recap
Tasks over documents
Language modeling
Long-term dependencies
Topic modeling
Evaluation
Coreference
Mention Detection
Model Components
Entity Mention Models
Entity-Centric Models
Complex Features
Advantages
Coreference Resolution
Questions
Cluster-level features
Model overview
Inference model
Why do I need coreference?
Language modeling with coreference
Discourse parsing
Course parsing
Shift-reduce parser
Discrete features
Recursive models
Complex models
Discourse relations
Discourse parse
Discourse dependency structure
Document classification
Document classification accuracy

Taught by

Graham Neubig

Reviews

