

Structure-Sensitive Dependency Learning in Recurrent Neural Networks - 2017

Center for Language & Speech Processing (CLSP), JHU via YouTube

Overview

Explore the capabilities of recurrent neural networks (RNNs) in learning structure-sensitive dependencies from natural language corpora, focusing on English subject-verb number agreement. Delve into Tal Linzen's research examining LSTMs' ability to predict verb number in various sentence types, analyzing their internal representations and comparing their performance to human agreement attraction errors.

Discover how the networks approximate syntactic structure in common sentences but struggle with complex constructions, highlighting the need for stronger inductive biases. Learn about the potential of multi-task learning to address these limitations, and gain insights into using linguistic and psycholinguistic methods to evaluate "black-box" neural network models. This hour-long lecture, delivered by Assistant Professor Tal Linzen of Johns Hopkins University, offers valuable perspectives on the intersection of cognitive science, linguistics, and artificial intelligence in natural language processing.
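The agreement-attraction setup discussed in the lecture can be illustrated with a small sketch (the example sentences and the baseline here are hypothetical, not from Linzen's materials): given the words preceding a verb, predict whether the verb should be singular or plural. A naive baseline that copies the number of the most recently seen noun succeeds on simple sentences but fails exactly where humans make attraction errors, namely when an intervening noun differs in number from the true subject.

```python
# Toy illustration of the subject-verb number agreement task.
# A baseline that predicts verb number from the MOST RECENT noun
# is misled by "attractor" nouns between the subject and the verb.
# The vocabulary below is a made-up toy lexicon for illustration.

NOUNS = {"key": "sg", "keys": "pl", "cabinet": "sg", "cabinets": "pl"}

def most_recent_noun_baseline(prefix):
    """Predict the upcoming verb's number ('sg' or 'pl') from the
    last noun seen in the sentence prefix."""
    number = None
    for word in prefix.split():
        if word in NOUNS:
            number = NOUNS[word]
    return number

# Simple case: subject is the last noun, so the baseline is correct.
assert most_recent_noun_baseline("the keys") == "pl"

# Attraction case: the true subject "keys" is plural, but the
# intervening singular attractor "cabinet" misleads the baseline,
# which wrongly predicts a singular verb.
assert most_recent_noun_baseline("the keys to the cabinet") == "sg"
```

An LSTM trained on the prediction task can instead learn to track the subject across intervening material, and Linzen's analysis asks how far that learned representation approximates genuine syntactic structure.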

Syllabus

Structure-Sensitive Dependency Learning in Recurrent Neural Networks -- Tal Linzen (JHU) - 2017

Taught by

Center for Language & Speech Processing (CLSP), JHU

