Neural Nets for NLP 2017 - Recurrent Neural Networks

Graham Neubig via YouTube

Classroom Contents

  1. Intro
  2. NLP and Sequential Data
  3. Long-distance Dependencies in Language
  4. Parameter Tying
  5. What Can RNNs Do?
  6. e.g. Language Modeling
  7. Representing Sentences
  8. Representing Contexts
  9. Recurrent Neural Networks in DyNet
  10. Parameter Initialization
  11. Sentence Initialization
  12. A Solution: Long Short-term Memory (Hochreiter and Schmidhuber 1997)
  13. Other Alternatives
  14. Handling Mini-batching
  15. Mini-batching Method
  16. Handling Long Sequences
  17. Example: LM - Sentence Classifier
  18. LSTM Structure
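For the LSTM topics in the outline (items 12 and 18), the core idea is a cell state updated through input, forget, and output gates. The following is a minimal illustrative sketch of one LSTM step in NumPy, not the lecture's DyNet code; the weight layout (four gates stacked in one matrix) and all variable names are assumptions for illustration.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM step (Hochreiter & Schmidhuber 1997, with forget gate).

    W: (4H, D) input weights, U: (4H, H) recurrent weights, b: (4H,) bias,
    with the input, forget, output, and candidate blocks stacked row-wise
    (an illustrative layout, not DyNet's internal one).
    """
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b
    i = sigmoid(z[:H])           # input gate: how much new info to write
    f = sigmoid(z[H:2 * H])      # forget gate: how much old cell to keep
    o = sigmoid(z[2 * H:3 * H])  # output gate: how much cell to expose
    g = np.tanh(z[3 * H:])       # candidate cell update
    c = f * c_prev + i * g       # new cell state
    h = o * np.tanh(c)           # new hidden state
    return h, c

# Run the cell over a toy sequence of 5 random input vectors.
rng = np.random.default_rng(0)
D, H = 3, 4
W = rng.standard_normal((4 * H, D)) * 0.1
U = rng.standard_normal((4 * H, H)) * 0.1
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for x in rng.standard_normal((5, D)):
    h, c = lstm_step(x, h, c, W, U, b)
print(h.shape)  # final hidden state, shape (4,)
```

Because the additive cell update `c = f * c_prev + i * g` avoids repeated squashing of the state, gradients flow more easily across time steps, which is the lecture's motivation for LSTMs over vanilla RNNs on long-distance dependencies.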
