
Natural Language Processing - Sam Bowman (NYU)

Paul G. Allen School via YouTube

Overview

Explore the latest advancements in general-purpose language understanding models through this insightful lecture by Sam Bowman from NYU. Delve into the GLUE and SuperGLUE shared-task benchmarks, examining their role in measuring progress towards building versatile pretrained neural network models for language comprehension. Gain valuable insights into the motivations behind these benchmarks and their implications for recent NLP developments. Engage with thought-provoking questions about future progress measurement in this field. Learn about Sam Bowman's background and his significant contributions to NLP research, including his focus on data, evaluation techniques, and modeling for sentence and paragraph understanding. Discover the lecture's comprehensive syllabus, covering topics such as the Recognizing Textual Entailment Challenge, the Winograd Schema Challenge, and the inner workings of BERT. Enhance your understanding of natural language processing and its evolving landscape through this captivating hour-long presentation.
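Benchmarks like GLUE and SuperGLUE summarize a model's performance across many tasks with a single aggregate score. As a rough illustration only (the task scores below are invented, and the real benchmarks use task-specific metrics rather than plain accuracy), a macro-average over per-task scores works like this:

```python
# Hypothetical per-task scores; GLUE/SuperGLUE-style leaderboards report one
# score per task and summarize progress with an average across tasks.
task_scores = {
    "RTE": 0.71,  # Recognizing Textual Entailment (made-up score)
    "WSC": 0.65,  # Winograd Schema Challenge (made-up score)
    "CB": 0.90,   # CommitmentBank (made-up score)
}

def benchmark_score(scores: dict[str, float]) -> float:
    """Macro-average of per-task scores, GLUE-leaderboard style."""
    return sum(scores.values()) / len(scores)

print(f"Overall score: {benchmark_score(task_scores):.3f}")  # → 0.753
```

A single number makes progress easy to track, but, as the lecture's discussion of limitations suggests, it can also hide large differences between tasks.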

Syllabus

Intro
The Goal
The Technique: Muppets
The Recognizing Textual Entailment Challenge
The Winograd Schema Challenge
Human Performance Estimate
The Commitment Bank
SuperGLUE: The Main Tasks
SuperGLUE Score: Highlights
GLUE and SuperGLUE: Limitations
What's inside BERT?
Case Study: NPI Licensing
Let's teach the model to judge acceptability.
Evaluation: What's Next?

Taught by

Paul G. Allen School
