Neural Nets for NLP 2019 - Attention

Graham Neubig, via YouTube

YouTube videos curated by Class Central.

Classroom Contents

  1. Intro
  2. Encoder-decoder Models
  3. Sentence Representations
  4. Basic Idea (Bahdanau et al. 2015)
  5. Calculating Attention (2)
  6. A Graphical Example
  7. Attention Score Functions (1)
  8. Input Sentence
  9. Hierarchical Structures (Yang et al. 2016)
  10. Multiple Sources
  11. Coverage • Problem: Neural models tend to drop or repeat
  12. Incorporating Markov Properties (Cohn et al. 2015)
  13. Bidirectional Training
  14. Hard Attention
  15. Summary of the Transformer (Vaswani et al. 2017)
  16. Attention Tricks
  17. Training Tricks
  18. Masking for Training • We want to perform training in as few operations as possible
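The "Calculating Attention" segment above covers the core mechanism: score each encoder state against a decoder query, normalize the scores with a softmax, and take the weighted sum of the states as the context vector. A minimal NumPy sketch of that idea, assuming dot-product scoring (the lecture also covers other score functions, e.g. Bahdanau's additive variant):

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def dot_product_attention(query, keys, values):
    """Sketch of basic attention: score each key against the query,
    normalize to a distribution, and return the weighted sum of the
    values (the context vector) plus the attention weights."""
    scores = keys @ query       # one scalar score per input position
    weights = softmax(scores)   # attention distribution over positions
    context = weights @ values  # weighted sum of value vectors
    return context, weights

# Toy example: 3 encoder states of dimension 4 (hypothetical data).
rng = np.random.default_rng(0)
keys = rng.normal(size=(3, 4))
values = rng.normal(size=(3, 4))
query = rng.normal(size=4)
context, weights = dot_product_attention(query, keys, values)
```

The weights always sum to one, which is what makes them interpretable as a soft alignment over the input sentence, as in the lecture's graphical example.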
