Neural Nets for NLP 2020 - Unsupervised and Semi-supervised Learning of Structure

Graham Neubig via YouTube

Overview

Explore a comprehensive lecture on unsupervised and semi-supervised learning of structure in neural networks for natural language processing. Delve into the distinctions between learning features and learning structure, examine various semi-supervised and unsupervised learning methods, and understand key design decisions for unsupervised models. Gain insights into practical examples of unsupervised learning, including cross-lingual applications and the challenges of learning latent structure. Learn about advanced concepts such as normalizing flow, gated convolution, and reinforcement learning approaches in the context of NLP. Discover how to leverage weak supervision and navigate the complexities of soft vs. hard tree structures in neural network architectures.

Syllabus

Supervised, Unsupervised, Semi-supervised
Learning Features vs. Learning Discrete Structure
Unsupervised Feature Learning (Review)
How do we Use Learned Features?
What About Discrete Structure?
What is our Objective?
A Simple First Attempt
Problem: Embeddings May Not be Indicative of Syntax
Normalizing Flow (Rezende and Mohamed 2015)
Cross-lingual Application of Unsupervised Models (He et al. 2019)
Soft vs. Hard Tree Structure
One Other Paradigm: Weak Supervision
Gated Convolution (Cho et al. 2014)
Learning with RL (Yogatama et al. 2016)
Difficulties in Learning Latent Structure

Taught by

Graham Neubig
