Neural Nets for NLP: Recurrent Neural Networks

Graham Neubig via YouTube

Classroom Contents

  1. Intro
  2. NLP and Sequential Data
  3. Long-distance Dependencies in Language
  4. Can be Complicated!
  5. Unrolling in Time
  6. Training RNNs
  7. Parameter Tying
  8. What Can RNNs Do?
  9. Representing Sentences
  10. Representing Contexts
  11. e.g. Language Modeling
  12. RNNLM Example: Loss Calculation and State Update (sketched below)
  13. LSTM Structure
  14. Other Alternatives
  15. Handling Mini-batching
  16. Mini-batching Method
  17. Bucketing/Sorting
  18. Handling Long Sequences
  19. RNN Strengths/Weaknesses
  20. Pre-training/Transfer
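
The outline above lists topic titles only. As a rough illustration of topic 12, "RNNLM Example: Loss Calculation and State Update", here is a minimal sketch of an RNN language model in PyTorch; it is not taken from the lecture, and the class name `RNNLM`, the layer sizes, and the toy sentence are assumptions made for this example. It also shows the parameter-tying idea from topic 7: the same recurrent cell weights are reused at every time step.

```python
# Minimal, illustrative RNN language model (not from the lecture).
import torch
import torch.nn as nn

class RNNLM(nn.Module):  # hypothetical name chosen for this sketch
    def __init__(self, vocab_size: int, embed_dim: int = 64, hidden_dim: int = 128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # One RNNCell whose weights are reused at every step: parameter tying across time.
        self.cell = nn.RNNCell(embed_dim, hidden_dim)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        """tokens: (seq_len,) tensor of word ids; returns the summed cross-entropy loss."""
        hidden = torch.zeros(1, self.cell.hidden_size)
        loss = torch.tensor(0.0)
        for t in range(len(tokens) - 1):
            # State update: fold the current word into the previous hidden state.
            hidden = self.cell(self.embed(tokens[t]).unsqueeze(0), hidden)
            # Loss calculation: score the next word from the updated state.
            logits = self.out(hidden)
            loss = loss + nn.functional.cross_entropy(logits, tokens[t + 1].unsqueeze(0))
        return loss

# Toy usage: a "sentence" of five word ids from a vocabulary of ten.
model = RNNLM(vocab_size=10)
sentence = torch.tensor([1, 4, 2, 7, 3])
loss = model(sentence)
loss.backward()  # gradients flow back through all unrolled steps
```

Unrolling this loop over time steps (topic 5) turns the per-step state update into an ordinary feed-forward computation graph that backpropagation can train.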
