Neural Nets for NLP 2017 - Neural Semantic Parsing

Graham Neubig via YouTube

Classroom Contents

  1. Intro
  2. Tree Structures of Syntax
  3. Representations of Semantics
  4. Meaning Representations
  5. Example Special-purpose Representations
  6. Example Query Tasks
  7. A First Attempt: Sequence-to-Sequence Models (Jia and Liang 2016) (see the sketch after this list)
  8. A Better Attempt: Tree-based Parsing Models, generating top-down with a hierarchical sequence-to-sequence model (Dong and Lapata 2016)
  9. Meaning Representation Desiderata (Jurafsky and Martin 17.1)
  10. First-order Logic
  11. Abstract Meaning Representation (Banarescu et al. 2013)
  12. Syntax-driven Semantic Parsing
  13. CCG and CCG Parsing
  14. Parsing to Graph Structures
  15. Semantic Role Labeling (Gildea and Jurafsky 2002)
  16. Neural Models for Semantic Role Labeling
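
The sequence-to-sequence approach named in item 7 can be pictured with a short sketch: encode the natural-language question with an LSTM and decode a linearized logical form token by token. This is an illustrative assumption of that general setup, not code from the lecture or from Jia and Liang (2016); the tiny vocabularies, the GeoQuery-style logical form, and the model dimensions are invented for the example.

```python
# Minimal sketch of a seq2seq semantic parser: encode a question,
# decode logical-form tokens. Vocabularies and sizes are toy assumptions.
import torch
import torch.nn as nn

SRC = {"<pad>": 0, "what": 1, "states": 2, "border": 3, "texas": 4}
TGT = {"<s>": 0, "</s>": 1, "answer(": 2, "state(": 3, "next_to(": 4,
       "stateid(": 5, "texas": 6, ")": 7}
ITGT = {i: t for t, i in TGT.items()}

class Seq2SeqParser(nn.Module):
    def __init__(self, d=64):
        super().__init__()
        self.src_emb = nn.Embedding(len(SRC), d)
        self.tgt_emb = nn.Embedding(len(TGT), d)
        self.encoder = nn.LSTM(d, d, batch_first=True)
        self.decoder = nn.LSTM(d, d, batch_first=True)
        self.out = nn.Linear(d, len(TGT))

    def forward(self, src_ids, tgt_ids):
        # Training-time pass with teacher forcing: the encoder's final
        # state seeds the decoder, which scores logical-form tokens.
        _, state = self.encoder(self.src_emb(src_ids))
        dec_hidden, _ = self.decoder(self.tgt_emb(tgt_ids), state)
        return self.out(dec_hidden)  # logits over the target vocabulary

    @torch.no_grad()
    def greedy_decode(self, src_ids, max_len=10):
        # Inference: feed back the highest-scoring token at each step.
        _, state = self.encoder(self.src_emb(src_ids))
        tok = torch.tensor([[TGT["<s>"]]])
        decoded = []
        for _ in range(max_len):
            h, state = self.decoder(self.tgt_emb(tok), state)
            tok = self.out(h).argmax(-1)
            if tok.item() == TGT["</s>"]:
                break
            decoded.append(ITGT[tok.item()])
        return decoded

if __name__ == "__main__":
    model = Seq2SeqParser()
    question = torch.tensor([[SRC["what"], SRC["states"],
                              SRC["border"], SRC["texas"]]])
    # Untrained weights, so the output tokens are arbitrary; this only
    # illustrates the encode-then-decode interface.
    print(model.greedy_decode(question))
```

Because the model is untrained, the decoded tokens are arbitrary; the point is only the flat encode-then-decode interface, which the tree-based model in item 8 refines by decoding the logical form hierarchically rather than as a single sequence.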
