Neural Nets for NLP 2017 - Parsing With Dynamic Programs

Graham Neubig via YouTube

Recursive Neural Networks (32 of 35)


Classroom Contents


  1. Introduction
  2. Linguistic Structure
  3. Dynamic Programming Based Models
  4. Minimum Spanning Tree
  5. Graph Based vs. Transition Based
  6. Chu-Liu-Edmonds Algorithm
  7. Eisner's Algorithm
  8. Quiz
  9. Before Neural Nets
  10. Higher Order Dependency Parsing
  11. Neural Models
  12. Motivation
  13. Model
  14. Example
  15. Global Probabilistic Training
  16. Code Example
  17. Algorithms
  18. Phrase Structures
  19. Parsing vs. Tagging
  20. Hypergraph Edges
  21. Scoring Edges
  22. CKY Algorithm
  23. Viterbi Algorithm
  24. Over Graphs
  25. CRF
  26. CRF Example
  27. CRF Over Trees
  28. Neural CRF
  29. Inference
  30. Parsing
  31. Structured Inference
  32. Recursive Neural Networks
  33. Reranking
  34. Reranking Results
  35. Next Time
