Neural Nets for NLP 2019 - Unsupervised and Semi-supervised Learning of Structure

Graham Neubig via YouTube

Classroom Contents

  1. Intro
  2. Supervised, Unsupervised, Semi-supervised
  3. Learning Features vs. Learning Discrete Structure
  4. Unsupervised Feature Learning (Review)
  5. How do we Use Learned Features?
  6. What About Discrete Structure?
  7. What is our Objective?
  8. A Simple First Attempt
  9. Hidden Markov Models w/ Gaussian Emissions: instead of parameterizing each state with a categorical distribution, we can use a Gaussian (or Gaussian mixture); see the sketch after this list
  10. Problem: Embeddings May Not be Indicative of Syntax
  11. Normalizing Flow (Rezende and Mohamed 2015)
  12. Soft vs. Hard Tree Structure
  13. One Other Paradigm: Weak Supervision
  14. Gated Convolution (Cho et al. 2014)
  15. Learning with RL (Yogatama et al. 2016)
  16. Difficulties in Learning Latent Structure (Williams et al. 2018)
  17. Phrase Structure vs. Dependency Structure
  18. Learning Dependency Heads w/ Attention (Kuncoro et al. 2017)
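
For chapter 9, here is a minimal sketch of the Gaussian-emission HMM idea: each latent state emits a continuous word-embedding vector through its own Gaussian, rather than a categorical distribution over the vocabulary. This is not the lecture's reference code; the state count, embedding dimension, and random parameters below are illustrative assumptions, and in practice the parameters would be fit with EM (Baum-Welch).

```python
# Minimal sketch of an HMM whose states emit word-embedding vectors via
# per-state Gaussians instead of categorical distributions over the vocabulary.
# All sizes and parameter values here are illustrative assumptions.
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)
K, D, T = 4, 16, 10          # latent states (induced "tags"), embedding dim, sentence length

# Model parameters (random here; in practice learned with EM / Baum-Welch).
log_pi = np.log(np.full(K, 1.0 / K))            # initial state distribution
A = rng.dirichlet(np.ones(K), size=K)           # transition matrix, rows sum to 1
log_A = np.log(A)
means = rng.normal(size=(K, D))                 # one Gaussian mean per state
covs = np.stack([np.eye(D) for _ in range(K)])  # identity covariance per state

# "Observed" sentence: a sequence of word embeddings (random stand-ins here).
X = rng.normal(size=(T, D))

# Gaussian emission log-likelihoods: log p(x_t | z_t = k), shape (T, K).
log_B = np.array([[multivariate_normal.logpdf(x, means[k], covs[k]) for k in range(K)]
                  for x in X])

# Forward algorithm in log space: marginal likelihood of the embedding sequence.
alpha = log_pi + log_B[0]
for t in range(1, T):
    alpha = log_B[t] + np.logaddexp.reduce(alpha[:, None] + log_A, axis=0)
log_likelihood = np.logaddexp.reduce(alpha)
print("log p(x_1:T) =", log_likelihood)
```

With real pretrained embeddings in place of the random X, the most likely state sequence can be decoded (e.g. with Viterbi) and compared against gold part-of-speech tags, which is the usual way this kind of unsupervised induction is evaluated.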
