

Neural Nets for NLP 2020 - Neural Nets + Knowledge Bases

Graham Neubig via YouTube

Overview

Explore neural network techniques for learning from and incorporating knowledge bases in natural language processing. Delve into methods for extracting knowledge from neural embeddings and integrating structured knowledge into neural models. Cover topics such as decomposable relation models, multi-hop relational context with graph neural networks, relation extraction using neural tensor networks, distant supervision for relation classification, and retrofitting embeddings to existing lexicons. Examine approaches to handle knowledge base incompleteness and model distant supervision noise in neural models. Learn about reasoning over text corpora as knowledge bases and jointly modeling knowledge base relations and text.

Syllabus

Intro
Knowledge Bases: structured databases of knowledge, typically containing entities and the relations between them
WordNet (Miller 1995)
Decomposable Relation Model (Xie et al. 2017). Idea: there are many relations, but each can be represented by a limited number of concepts. Method: treat each relation map as a mixture of concepts, with a sparse mixture vector α (sketched below).
Multi-hop Relational Context w/ Graph Neural Networks (Schlichtkrull et al. 2017; sketched below)
Knowledge Base Incompleteness
Relation Extraction w/ Neural Tensor Networks (Socher et al. 2013; sketched below)
Distant Supervision for Relation Extraction (Mintz et al. 2009; sketched below)
Relation Classification w/ CNNs (Zeng et al. 2014; sketched below)
Jointly Modeling KB Relations and Text (Toutanova et al. 2015). To model textual links between words with a neural net, aggregate over multiple instances of links in the dependency tree (sketched below).
Modeling Distant Supervision Noise in Neural Models (Luo et al. 2017). Idea: there is noise in distant supervision labels, so we want to model it (sketched below).
Retrofitting of Embeddings to Existing Lexicons (Faruqui et al. 2015; sketched below)
Reasoning over Text Corpus as a Knowledge Base (Dhingra et al. 2020)
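
The decomposable relation model above can be pictured with a minimal numpy sketch: each relation's linear map is assembled as a sparse mixture of a small shared bank of concept matrices. All sizes, the thresholding used to sparsify α, and the random values are illustrative assumptions, not Xie et al.'s actual parameterization.

```python
# Sketch: compose each relation's linear map as a sparse mixture of shared
# "concept" matrices, weighted by a sparse mixture vector alpha.
import numpy as np

rng = np.random.default_rng(0)
n_relations, n_concepts, d = 40, 8, 64          # illustrative sizes

concepts = rng.normal(size=(n_concepts, d, d))  # shared concept matrices
alpha = rng.random((n_relations, n_concepts))
alpha[alpha < 0.7] = 0.0                        # zero out most weights -> sparse mixtures
alpha /= alpha.sum(axis=1, keepdims=True) + 1e-8

def relation_map(r):
    """Linear map for relation r: a weighted sum of concept matrices."""
    return np.einsum("c,cij->ij", alpha[r], concepts)

head = rng.normal(size=d)
projected = relation_map(3) @ head              # apply relation 3's map to a head entity
print(projected.shape)
```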
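For multi-hop relational context, here is a compact sketch of one R-GCN-style propagation step in the spirit of Schlichtkrull et al.: each node aggregates neighbor vectors through a per-relation weight matrix plus a self-loop transform, and stacking layers gathers multi-hop context. The toy graph, dimensions, and parameters are made up.

```python
# Sketch: one R-GCN-style propagation step over a tiny knowledge graph.
import numpy as np

rng = np.random.default_rng(0)
n_nodes, n_relations, d = 5, 2, 8
H = rng.normal(size=(n_nodes, d))                    # current node embeddings
W_rel = rng.normal(size=(n_relations, d, d)) * 0.1   # one weight matrix per relation
W_self = rng.normal(size=(d, d)) * 0.1               # self-loop weight matrix
edges = [(0, 0, 1), (1, 0, 2), (2, 1, 0), (3, 1, 4)] # (source, relation, target) triples

def rgcn_layer(H):
    out = H @ W_self                                  # self-loop term
    for r in range(n_relations):
        msg = np.zeros_like(H)
        deg = np.zeros(n_nodes)
        for s, rel, t in edges:
            if rel == r:
                msg[t] += H[s] @ W_rel[r]             # message from neighbor s to t
                deg[t] += 1
        out += msg / np.maximum(deg, 1.0)[:, None]    # per-relation degree normalization
    return np.maximum(out, 0.0)                       # ReLU

H = rgcn_layer(rgcn_layer(H))                         # two layers -> two-hop relational context
```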
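The Neural Tensor Network scores a (head, relation, tail) triple with a bilinear tensor term plus a linear term over the concatenated entity vectors, passed through tanh and projected to a scalar. The sketch below assumes random placeholder parameters and illustrative sizes.

```python
# Sketch: NTN score for a (head, relation, tail) triple.
import numpy as np

rng = np.random.default_rng(0)
d, k = 50, 4                                  # entity dimension, number of tensor slices

W = rng.normal(size=(k, d, d)) * 0.01         # bilinear tensor: one d x d slice per hidden unit
V = rng.normal(size=(k, 2 * d)) * 0.01        # linear term over [head; tail]
b = np.zeros(k)
u = rng.normal(size=k)                        # projects the k hidden units to a scalar

def ntn_score(e1, e2):
    bilinear = np.einsum("i,kij,j->k", e1, W, e2)     # e1^T W_k e2 for each slice k
    linear = V @ np.concatenate([e1, e2]) + b
    return float(u @ np.tanh(bilinear + linear))      # higher = more plausible triple

head, tail = rng.normal(size=d), rng.normal(size=d)
print(ntn_score(head, tail))
```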
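Distant supervision generates training data by assuming that any sentence mentioning both entities of a known KB triple expresses that triple's relation; the tiny KB and corpus below are invented to show the (noisy) labeling step.

```python
# Sketch: distantly label sentences by matching KB entity pairs.
kb = {("Barack Obama", "Hawaii"): "born_in",
      ("Google", "Larry Page"): "founded_by"}

corpus = [
    "Barack Obama was born in Hawaii .",
    "Barack Obama visited Hawaii last year .",   # also matched -> label noise
    "Larry Page co-founded Google in 1998 .",
]

def distantly_label(corpus, kb):
    examples = []
    for sent in corpus:
        for (e1, e2), rel in kb.items():
            if e1 in sent and e2 in sent:
                examples.append((sent, e1, e2, rel))  # noisy positive example
    return examples

for ex in distantly_label(corpus, kb):
    print(ex)
```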
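A rough sketch of CNN-based relation classification in the style of Zeng et al.: each token gets a word embedding plus two position embeddings (relative distances to the entity mentions), a narrow convolution slides over the sentence, max-pooling gives a fixed-size feature vector, and a softmax predicts the relation. The random embedding tables stand in for learned parameters.

```python
# Sketch: relation classification with a 1-D CNN over word + position features.
import numpy as np

rng = np.random.default_rng(0)
d_word, d_pos, n_filters, n_rel, win = 50, 5, 100, 4, 3
d_in = d_word + 2 * d_pos

tokens = "Obama was born in Hawaii".split()
word_table = {w: rng.normal(size=d_word) for w in tokens}  # stand-in word embeddings
pos_table = rng.normal(size=(201, d_pos))                  # relative-distance embeddings, offset 100
W_conv = rng.normal(size=(n_filters, win * d_in)) * 0.01
W_out = rng.normal(size=(n_rel, n_filters)) * 0.01

def featurize(tokens, e1_idx, e2_idx):
    rows = [np.concatenate([word_table[t],
                            pos_table[i - e1_idx + 100],   # distance to entity 1
                            pos_table[i - e2_idx + 100]])  # distance to entity 2
            for i, t in enumerate(tokens)]
    return np.stack(rows)                                  # (seq_len, d_in)

def classify(X):
    windows = np.stack([X[i:i + win].ravel() for i in range(len(X) - win + 1)])
    feats = np.tanh(windows @ W_conv.T).max(axis=0)        # convolve, then max-pool over time
    logits = W_out @ feats
    return np.exp(logits) / np.exp(logits).sum()           # softmax over relation labels

print(classify(featurize(tokens, 0, 4)))
```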
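For joint modeling of KB relations and text, one way to picture it: dependency-path patterns between an entity pair act as extra relation types, their vectors are built compositionally from path tokens (a simple average below stands in for the paper's convolutional encoder), and the same bilinear-diagonal scorer is shared with KB relations. Every embedding here is a random placeholder, so this is only a sketch of the idea.

```python
# Sketch: one scoring function shared by KB relations and textual (dependency-path) relations.
import numpy as np

rng = np.random.default_rng(0)
d = 20
entity = {"Obama": rng.normal(size=d), "Hawaii": rng.normal(size=d)}
kb_rel = {"born_in": rng.normal(size=d)}
token = {t: rng.normal(size=d) for t in ["nsubjpass", "born", "prep_in"]}

def textual_relation(path):
    # aggregate over the tokens of a dependency path (average as a stand-in for a CNN encoder)
    return np.mean([token[t] for t in path], axis=0)

def score(e1, rel_vec, e2):
    return float(entity[e1] @ (rel_vec * entity[e2]))  # bilinear-diagonal (DistMult-style) score

print(score("Obama", kb_rel["born_in"], "Hawaii"))                                    # KB triple
print(score("Obama", textual_relation(["nsubjpass", "born", "prep_in"]), "Hawaii"))   # text link
```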
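One generic way to model distant-supervision noise, sketched below: the network predicts a distribution over the true relation, a learned row-stochastic transition matrix maps it to a distribution over the observed (noisy) label, and the loss is taken against that noisy label. This illustrates the transition-matrix idea rather than Luo et al.'s exact formulation.

```python
# Sketch: route the model's "clean" prediction through a noise transition matrix
# before comparing against the (possibly wrong) distantly supervised label.
import numpy as np

rng = np.random.default_rng(0)
n_rel = 4

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

clean_logits = rng.normal(size=n_rel)                 # stand-in for a sentence encoder's output
p_true = softmax(clean_logits)                        # belief over the *true* relation
T = np.stack([softmax(row) for row in rng.normal(size=(n_rel, n_rel))])  # learned noise matrix
p_observed = p_true @ T                               # distribution over *observed* labels

noisy_label = 2
loss = -np.log(p_observed[noisy_label])               # trained end to end against noisy labels
print(loss)
```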
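Retrofitting nudges each pretrained word vector toward the average of its lexicon neighbors (e.g., WordNet synonyms) while keeping it close to its original value; with the usual choice of weights the iterative update has the simple closed form below. The toy vectors and synonym lexicon are made up.

```python
# Sketch: retrofit pretrained word vectors to a synonym lexicon.
import numpy as np

def retrofit(vectors, lexicon, iterations=10):
    new = {w: v.copy() for w, v in vectors.items()}
    for _ in range(iterations):
        for w, neighbors in lexicon.items():
            nbrs = [n for n in neighbors if n in new]
            if w not in new or not nbrs:
                continue
            beta = 1.0 / len(nbrs)
            # weighted average of the original vector and the current neighbor vectors
            new[w] = (vectors[w] + beta * sum(new[n] for n in nbrs)) / 2.0
    return new

# toy example with made-up 3-d vectors and a tiny synonym lexicon
vecs = {"happy": np.array([1.0, 0.0, 0.0]),
        "glad": np.array([0.0, 1.0, 0.0]),
        "joyful": np.array([0.0, 0.0, 1.0])}
lex = {"happy": ["glad", "joyful"], "glad": ["happy"], "joyful": ["happy"]}
print(retrofit(vecs, lex)["happy"])
```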

Taught by

Graham Neubig

