Natural Language Supervision for Deep Learning - Toward Language-Guided Training

Neurosymbolic Programming for Science via YouTube

Overview

Explore the potential of natural language supervision in machine learning through Jacob Andreas' talk at the Neurosymbolic Programming for Science conference. Delve into the contrast between traditional example-based learning and language-guided approaches for training deep networks. Discover how human learners acquire concepts and skills through richer, language-based supervision. Examine recent successes in natural language processing and the challenges of extending language-based training to broader learning problems. Learn about recent results on using natural language to guide search and library learning in inductive program synthesis. Investigate the connections between these approaches and human concept learning. The talk also touches on psychology experiments, simple tasks, program synthesis, primitives, training-time regularization, and the use of natural language at training time.

Syllabus

Intro
How people learn
Psychology experiment
Learning from language
Simple tasks
Training
Program Synthesis
Summary
Audience Questions
Primitives
Under the Hood
Training Time Regularization
Training Time Natural Language

Taught by

Neurosymbolic Programming for Science
