How Brain Computations Can Inspire New Paths in AI - Part 2

MITCBMM via YouTube

Overview

Explore how brain computations can inspire new paths in artificial intelligence in this lecture by Gabriel Kreiman of Harvard University and Boston Children's Hospital. Delve into current computational models and their limitations, examining topics such as occluded objects, backward masking, and limited presentation time. Analyze observations and interpretations at the neurophysiological level, including individual trials and computational models such as recurrent neural networks (RNNs). Investigate object recognition, minimal context, and contextual reasoning, while evaluating model performance. Examine computer graphics, adversarial images, and the challenge of understanding humor in images. Gain insight into the intersection of neuroscience and AI, uncovering potential avenues for machine learning algorithms inspired by human brain function.

Syllabus

Intro
What current computational models capture
Occluded objects
Bubbles
Backward masking
Limiting presentation time
Observations
Interpretation
Neurophysiological level
Individual trials
Computational model
RNN
Unfolding and Folding
Object recognition
Minimal context
Contextual reasoning
Model performance
Computer graphics
Paper picks can fly
Adversarial images
Understanding an image
Predicting humor

Taught by

MITCBMM
