CMU Neural Nets for NLP: Model Interpretation

Graham Neubig via YouTube

Classroom Contents

  1. Intro
  2. Why interpretability?
  3. What is interpretability?
  4. Two broad themes
  5. Source Syntax in NMT
  6. Why neural translations are the right length?
  7. Fine-grained analysis of sentence embeddings
  8. What you can cram into a single vector: Probing sentence embeddings for linguistic properties
  9. Issues with probing
  10. Minimum Description Length (MDL) Probes
  11. How to evaluate?
  12. Explanation Techniques: gradient-based importance scores (see the sketch after this list)
  13. Explanation Technique: Extractive Rationale Generation
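Item 12 covers gradient-based importance scores. As a minimal sketch of the general idea (not the specific method presented in the lecture), the code below scores each input token by the gradient of the predicted class score with respect to its embedding; the toy model, token ids, and scoring choice are hypothetical placeholders.

```python
# Sketch: gradient-based importance scores (saliency) for a text classifier.
import torch
import torch.nn as nn


class TinyClassifier(nn.Module):
    """Toy bag-of-embeddings classifier, only here to make the sketch runnable."""

    def __init__(self, vocab_size=100, dim=16, num_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.out = nn.Linear(dim, num_classes)

    def forward(self, embeddings):
        # embeddings: (seq_len, dim) -> mean pool -> class scores
        return self.out(embeddings.mean(dim=0))


model = TinyClassifier()
token_ids = torch.tensor([3, 17, 42, 8])      # hypothetical input tokens
embeddings = model.embed(token_ids).detach()
embeddings.requires_grad_(True)               # track gradients w.r.t. the inputs

scores = model(embeddings)
target_class = scores.argmax()
scores[target_class].backward()               # gradient of the predicted class score

# Importance of each token = L2 norm of the gradient on its embedding.
importance = embeddings.grad.norm(dim=1)
print(importance)                             # one saliency score per input token
```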
