Priors for Semantic Variables - Yoshua Bengio

Institute for Advanced Study via YouTube

Overview

Explore the frontiers of machine learning in this Theoretical Machine Learning seminar featuring renowned researcher Yoshua Bengio of the Université de Montréal. Delve into the concept of priors for semantic variables and examine the limitations of current machine learning approaches. Gain insights into systematization, learning theory, and the role of conscious processing in AI. Investigate the differences between System 1 and System 2 thinking, and explore knowledge representation, attention mechanisms, and Global Workspace Theory. Discover the importance of causality, the Independent Mechanism Hypothesis, and localized changes in AI systems. Learn about parameterization, multitask learning, and modular recurrent networks as Bengio shares cutting-edge ideas for advancing machine learning capabilities.

Syllabus

Introduction
Limitations of machine learning
Systematization
Learning theory
Conscious processing
Agency
System 1 vs System 2
The Kind of Knowledge
Knowledge Representation
Attention
Recurrent Independent Mechanisms
Global Workspace Theory
Attention Mechanisms
Causality
Independent Mechanism Hypothesis
Localized Changes
Parameterization
Multitask learning
Modular recurrent net

Taught by

Institute for Advanced Study

