Neural Nets for NLP 2018 - Neural Semantic Parsing

Graham Neubig via YouTube

Classroom Contents

  1. Tree Structures of Syntax
  2. Representations of Semantics
  3. Meaning Representations
  4. Example Special-purpose Representations
  5. Example Query Tasks
  6. Example Command and Control Tasks
  7. Example Code Generation Tasks
  8. A Better Attempt: Tree-based Parsing Models • Generate from the top down using a hierarchical sequence-to-sequence model (Dong and Lapata 2016)
  9. Code Generation: Handling Syntax • Code also has syntax, e.g. in the form of Abstract Syntax Trees (see the AST sketch after this list)
  10. Problem w/ Weakly Supervised Learning: Spurious Logical Forms • Sometimes you can get the right answer without actually doing the generalizable thing (Guu et al. 2017)
  11. Meaning Representation Desiderata (Jurafsky and Martin 17.1)
  12. First-order Logic
  13. Abstract Meaning Representation (Banarescu et al. 2013)
  14. Other Formalisms
  15. Parsing to Graph Structures
  16. Linearization for Graph Structures (Konstas et al. 2017) (see the linearization sketch after this list)
  17. CCG and CCG Parsing
  18. Neural Module Networks: Soft Syntax-driven Semantics (Andreas et al. 2016) • Standard syntax-semantics interfaces use symbolic representations • It is also possible to use syntax to guide the structure of…
  19. Neural Models for Semantic Role Labeling • A simple model with a deep highway LSTM tagger works well (He et al. 2017)
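
Topic 9 notes that code itself has syntax, typically represented as an Abstract Syntax Tree. Below is a minimal sketch (not from the lecture) using Python's standard ast module: it parses a hypothetical one-line program and walks its AST, the kind of structure a syntax-aware code generation model decodes node by node instead of emitting raw tokens.

    import ast

    # Hypothetical one-line program; any parseable Python snippet works here.
    source = "total = price * (1 + tax_rate)"

    # Parse to an Abstract Syntax Tree: an Assign node whose value is a BinOp
    # over Name and Constant leaves.
    tree = ast.parse(source)
    print(ast.dump(tree))

    # Walk the tree and list node types: these are the kinds of units a
    # tree-structured decoder would generate one at a time.
    for node in ast.walk(tree):
        print(type(node).__name__)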
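
Topic 16 covers linearizing graph-structured meaning representations so that an ordinary sequence-to-sequence model can read or produce them (Konstas et al. 2017). Below is a minimal sketch of that idea, assuming a toy nested-list encoding of a simplified AMR-like meaning representation (the concept and role names are illustrative, not from the lecture): a depth-first traversal turns the structure into a bracketed token sequence.

    def linearize(node):
        """Depth-first traversal producing a flat token list with scope brackets."""
        if isinstance(node, str):        # leaf: a concept or constant
            return [node]
        head, *children = node           # internal node: [relation/concept, child, ...]
        tokens = ["(", head]
        for child in children:
            tokens.extend(linearize(child))
        tokens.append(")")
        return tokens

    # Toy, simplified AMR-style meaning representation for "the boy wants to go".
    mr = ["want-01", ["ARG0", "boy"], ["ARG1", ["go-01", ["ARG0", "boy"]]]]
    print(" ".join(linearize(mr)))
    # -> ( want-01 ( ARG0 boy ) ( ARG1 ( go-01 ( ARG0 boy ) ) ) )

Konstas et al. apply this kind of linearization in both directions: parsing (text to a linearized graph) and generation (a linearized graph back to text).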
