YouTube videos curated by Class Central.
Classroom Contents
Neural Nets for NLP 2019 - Unsupervised and Semi-supervised Learning of Structure
- 1 Intro
- 2 Supervised, Unsupervised, Semi-supervised
- 3 Learning Features vs. Learning Discrete Structure
- 4 Unsupervised Feature Learning (Review)
- 5 How do we Use Learned Features?
- 6 What About Discrete Structure?
- 7 What is our Objective?
- 8 A Simple First Attempt
- 9 Hidden Markov Models w/ Gaussian Emissions. Instead of parameterizing each state with a categorical distribution, we can use a Gaussian (or Gaussian mixture)!
- 10 Problem: Embeddings May Not be Indicative of Syntax
- 11 Normalizing Flow (Rezende and Mohamed 2015)
- 12 Soft vs. Hard Tree Structure
- 13 One Other Paradigm: Weak Supervision
- 14 Gated Convolution (Cho et al. 2014)
- 15 Learning with RL (Yogatama et al. 2016)
- 16 Difficulties in Learning Latent Structure (Williams et al. 2018)
- 17 Phrase Structure vs. Dependency Structure
- 18 Learning Dependency Heads w/ Attention (Kuncoro et al. 2017)
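The idea in chapter 9 — an HMM whose per-state emission is a Gaussian over word embeddings rather than a categorical distribution over words — can be sketched with the forward algorithm in log space. This is a minimal illustrative sketch, not code from the lecture; the function names, diagonal-covariance assumption, and array shapes are all choices made here for clarity.

```python
import numpy as np

def gaussian_log_prob(x, mean, var):
    # Log-density of a diagonal Gaussian; broadcasts over states when
    # mean/var have shape (K, D) and x has shape (D,).
    return -0.5 * np.sum(np.log(2 * np.pi * var) + (x - mean) ** 2 / var, axis=-1)

def forward_log_likelihood(obs, log_pi, log_A, means, variances):
    """Forward algorithm for an HMM with diagonal-Gaussian emissions.

    obs:       (T, D) sequence of embedding vectors
    log_pi:    (K,)   log initial-state probabilities
    log_A:     (K, K) log transitions, log_A[i, j] = log P(z_t = j | z_{t-1} = i)
    means:     (K, D) per-state emission means
    variances: (K, D) per-state diagonal variances
    """
    T = obs.shape[0]
    # alpha[j] = log P(obs[0..t], z_t = j)
    alpha = log_pi + gaussian_log_prob(obs[0], means, variances)
    for t in range(1, T):
        # Log-sum-exp over the previous state, then add the emission term.
        alpha = (np.logaddexp.reduce(alpha[:, None] + log_A, axis=0)
                 + gaussian_log_prob(obs[t], means, variances))
    return np.logaddexp.reduce(alpha)
```

Training would then maximize this marginal likelihood over all state sequences (e.g. with EM or gradient ascent); a Gaussian-mixture emission replaces `gaussian_log_prob` with a log-sum-exp over mixture components per state.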