Neural Nets for NLP 2020 - Document Level Models

Graham Neubig via YouTube

Classroom Contents

  1. Some NLP Tasks We've Handled
  2. Some Connections to Tasks over Documents
  3. Document Level Language Modeling
  4. Remember: Modeling using Recurrent Networks
  5. Separate Encoding for Coarse-grained Document Context (Mikolov & Zweig 2012)
  6. What Context to Incorporate?
  7. Self-attention/Transformers Across Sentences
  8. Document Problems: Entity Coreference
  9. Mention (Noun Phrase) Detection
  10. Components of a Coreference Model
  11. Coreference Models: Instances
  12. Mention Pair Models (see the sketch after this list)
  13. Entity Models: Entity-Mention Models
  14. Advantages of Neural Network Models for Coreference
  15. Coreference Resolution w/ Entity-Level Distributed Representations
  16. Deep Reinforcement Learning for Mention-Ranking Coreference Models
  17. End-to-End Neural Coreference (Span Model)
  18. End-to-End Neural Coreference (Coreference Model)
  19. Using Coreference in Neural Models
  20. Discourse Parsing w/ Attention-based Hierarchical Neural Networks
  21. Uses of Discourse Structure
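The outline above only names the techniques covered. As a rough illustration of the kind of model behind item 12 ("Mention Pair Models"), the following is a minimal sketch, not taken from the lecture: a feedforward scorer over pairs of mention embeddings. The class name, dimensions, and the assumption that fixed-size mention representations are already computed are all hypothetical.

```python
# Illustrative sketch only (not from the lecture): a minimal mention-pair
# coreference scorer. It assumes fixed-size mention embeddings are already
# available; names and dimensions below are hypothetical.
import torch
import torch.nn as nn

class MentionPairScorer(nn.Module):
    def __init__(self, mention_dim: int = 256, hidden_dim: int = 128):
        super().__init__()
        # Score a (candidate antecedent, mention) pair from the
        # concatenation of the two mention representations.
        self.ff = nn.Sequential(
            nn.Linear(2 * mention_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, 1),
        )

    def forward(self, antecedent: torch.Tensor, mention: torch.Tensor) -> torch.Tensor:
        # antecedent, mention: (batch, mention_dim)
        pair = torch.cat([antecedent, mention], dim=-1)
        return self.ff(pair).squeeze(-1)  # (batch,) coreference scores

# Usage: score one hypothetical pair of mention embeddings.
scorer = MentionPairScorer()
score = scorer(torch.randn(1, 256), torch.randn(1, 256))
```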
