Neural Nets for NLP 2021 - Document-Level Models

Graham Neubig via YouTube

Classroom Contents

  1. Some NLP Tasks we've Handled
  2. Some Connections to Tasks over Documents
  3. Document-Level Language Modeling
  4. Remember: Modeling using Recurrent Networks
  5. Simple: Infinitely Pass State (see the sketch after this list)
  6. Separate Encoding for Coarse-grained Document Context
  7. Self-attention/Transformers Across Sentences
  8. Transformer-XL: Truncated BPTT + Transformer
  9. Adaptive Span Transformers
  10. Reformer: Efficient Adaptively Sparse Attention
  11. How to Evaluate Document-level Models?
  12. Document Problems: Entity Coreference
  13. Mention (Noun Phrase) Detection
  14. Components of a Coreference Model
  15. Coreference Models: Instances
  16. Mention Pair Models
  17. Entity Models: Entity-Mention Models
  18. Advantages of Neural Network Models for Coreference
  19. End-to-End Neural Coreference (Span Model)
  20. End-to-End Neural Coreference (Coreference Model)
  21. Using Coreference in Neural Models
  22. Discourse Parsing w/ Attention-based Hierarchical Neural Networks
  23. Uses of Discourse Structure in Neural Models
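
The "Simple: Infinitely Pass State" idea from item 5 above can be illustrated with a short sketch: a recurrent language model is run over one sentence at a time, and its hidden state is carried across sentence boundaries so that later sentences are conditioned on document context. The code below is not from the lecture; the class name, dimensions, and the use of PyTorch are illustrative assumptions, and gradients are truncated at sentence boundaries (truncated BPTT) while the numeric state keeps flowing forward.

    # Minimal sketch (illustrative, not from the lecture) of document-level
    # language modeling by carrying RNN state across sentence boundaries.
    import torch
    import torch.nn as nn

    class DocumentLM(nn.Module):  # hypothetical name
        def __init__(self, vocab_size=10000, emb_dim=256, hidden_dim=512):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, emb_dim)
            self.rnn = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
            self.out = nn.Linear(hidden_dim, vocab_size)

        def forward(self, sentence_ids, state=None):
            # sentence_ids: (batch, sentence_length) token ids for one sentence
            hidden, state = self.rnn(self.embed(sentence_ids), state)
            return self.out(hidden), state

    model = DocumentLM()
    state = None
    toy_document = [torch.randint(0, 10000, (1, 12)) for _ in range(3)]
    for sentence in toy_document:
        logits, state = model(sentence, state)
        # Detach so backpropagation is truncated at the sentence boundary,
        # but the hidden state itself still carries document context forward.
        state = tuple(s.detach() for s in state)

Transformer-XL (item 8) applies the same segment-level recurrence idea to Transformers by caching hidden states from the previous segment and attending to them without backpropagating through them.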
