Neural Nets for NLP 2020 - Generating Trees Incrementally

Graham Neubig via YouTube

Currently playing: Recursive Neural Networks (Socher et al. 2011) (9 of 13)

Classroom Contents

  1. Intro
  2. Two Common Types of Linguistic Structure
  3. Semantic Parsing: Another Representative Tree Generation Task
  4. Shift-Reduce Example
  5. Classification for Shift-Reduce
  6. Making Classification Decisions
  7. What Features to Extract?
  8. Why Tree Structure?
  9. Recursive Neural Networks (Socher et al. 2011) (see the sketch after this list)
  10. Why Linguistic Structure?
  11. Clarification about Meaning Representations (MRs): machine-executable MRs (our focus today) are executable programs that accomplish a task; MRs for semantic annotation capture the semantics of natural language…
  12. Core Research Question for Better Models: how to add inductive biases to networks to better capture the structure of programs?
  13. Summary: Supervised Learning of Semantic Parsers. Key question: how to design decoders that follow the structure of programs?
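
Item 9 above refers to the tree-structured composition model of Socher et al. (2011). The following is a minimal illustrative sketch of that composition step, not code from the lecture; the dimensions, weights, and example words are all assumptions made for illustration.

    # Minimal sketch of a recursive neural network composition step
    # (in the spirit of Socher et al. 2011): two child vectors are
    # combined into a parent vector by one learned affine transform
    # followed by a tanh nonlinearity. All values here are illustrative.
    import numpy as np

    rng = np.random.default_rng(0)
    d = 4                                # toy embedding size (assumption)
    W = rng.standard_normal((d, 2 * d))  # composition weights
    b = np.zeros(d)                      # composition bias

    def compose(left, right):
        """Combine two child node vectors into one parent node vector."""
        return np.tanh(W @ np.concatenate([left, right]) + b)

    # Build "(the cat) sat" bottom-up: compose the leaves, then the phrase with "sat".
    the, cat, sat = (rng.standard_normal(d) for _ in range(3))
    noun_phrase = compose(the, cat)
    sentence = compose(noun_phrase, sat)
    print(sentence.shape)  # (4,)

The same compose function is applied recursively at every node of a parse tree, which is what makes the network "recursive" rather than sequential.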
