Neural Nets for NLP 2021 - Neural Nets + Knowledge Bases

Graham Neubig via YouTube

Overview

Explore neural networks and knowledge bases in this comprehensive lecture from CMU's Neural Networks for NLP course. Dive into methods for learning knowledge base embeddings and incorporating structured knowledge into neural models. Discover techniques for probing language models for the knowledge they encode. Learn about WordNet, knowledge graph embeddings, relation extraction, distant supervision, retrofitting embeddings to existing lexicons, and open information extraction. Examine the differences between modeling word embeddings and modeling relations. Investigate P-tuning for directly optimizing prompt embeddings, and see why nonparametric models can outperform parametric ones, in this informative 44-minute session led by Graham Neubig and Zhengbao Jiang.
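The knowledge graph embedding method cited in the syllabus, Bordes et al. 2013, is TransE, which represents each relation as a translation vector so that head + relation ≈ tail holds for true triples. The following minimal NumPy sketch (with a made-up toy vocabulary; not code from the lecture) illustrates the idea using a margin ranking loss over corrupted triples:

```python
import numpy as np

# Minimal TransE-style sketch (Bordes et al. 2013) on a hypothetical toy KB.
# Relations are translations in vector space: head + relation ≈ tail.
rng = np.random.default_rng(0)

entities = ["dog", "cat", "animal", "mammal"]        # toy entities (made up)
relations = ["is_a"]                                 # toy relation (made up)
triples = [("dog", "is_a", "animal"), ("cat", "is_a", "mammal")]

dim, lr, margin = 16, 0.05, 1.0
E = rng.normal(scale=0.1, size=(len(entities), dim))   # entity embeddings
R = rng.normal(scale=0.1, size=(len(relations), dim))  # relation embeddings
e_idx = {e: i for i, e in enumerate(entities)}
r_idx = {r: i for i, r in enumerate(relations)}

def score(h, r, t):
    """Higher is better: negative distance between h + r and t."""
    return -np.linalg.norm(E[e_idx[h]] + R[r_idx[r]] - E[e_idx[t]])

for step in range(2000):
    h, r, t = triples[step % len(triples)]
    t_neg = entities[rng.integers(len(entities))]    # corrupt the tail entity
    if t_neg == t:
        continue
    # Margin ranking loss: the true triple should outscore the corrupted one.
    if margin - score(h, r, t) + score(h, r, t_neg) > 0:
        vp = E[e_idx[h]] + R[r_idx[r]] - E[e_idx[t]]
        vn = E[e_idx[h]] + R[r_idx[r]] - E[e_idx[t_neg]]
        gp = vp / (np.linalg.norm(vp) + 1e-9)        # gradient of ||vp||
        gn = vn / (np.linalg.norm(vn) + 1e-9)        # gradient of ||vn||
        E[e_idx[h]] -= lr * (gp - gn)
        R[r_idx[r]] -= lr * (gp - gn)
        E[e_idx[t]] += lr * gp
        E[e_idx[t_neg]] -= lr * gn

print(score("dog", "is_a", "animal"))  # distance shrinks during training
print(score("dog", "is_a", "cat"))     # a corrupted triple should score lower
```

The full method also renormalizes entity vectors after each step and samples negatives more carefully; both are omitted here to keep the sketch short.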

Syllabus

Intro
Knowledge Bases
WordNet (Miller 1995)
Learning Knowledge Graph Embeddings (Bordes et al. 2013)
Remember: Consistency in Embeddings
Relation Extraction w/ Neural Tensor Networks (Socher et al. 2013)
Distant Supervision for Relation Extraction (Mintz et al. 2009)
Jointly Modeling KB Relations and Text (Toutanova et al. 2015)
Modeling Distant Supervision Noise in Neural Models (Luo et al. 2017)
Retrofitting of Embeddings to Existing Lexicons (Faruqui et al. 2015)
Open Information Extraction (Banko et al. 2007)
Neural Models for Open IE
Modeling Word Embeddings vs. Modeling Relations
P-tuning: Directly Optimize Embeddings (Liu et al. 2021)
Nonparametric Models Outperform Parametric Models

Taught by

Graham Neubig
