Overview
Explore a 32-minute video analyzing Geoffrey Hinton's research on biologically plausible backpropagation in the brain. Delve into the challenge of understanding how synapses are modified in multilayered cortical networks, and how backpropagation in artificial neural networks might offer insights into cortical learning. Examine the role of feedback connections in delivering error signals and inducing neural activities that approximate those signals. Cover key concepts such as synaptic symmetry, error signals, spiking rates, and the NGRAD hypothesis. Investigate autoencoders, single-layer autoencoders, feedforward functions, and approximate inverses to gain a comprehensive understanding of this cutting-edge neuroscience research.
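To make the "feedforward function" and "approximate inverse" ideas mentioned above concrete, here is a minimal sketch (not taken from the video) of a single-layer autoencoder in which the decoder approximately inverts the encoder's feedforward mapping. All names, sizes, and values are illustrative assumptions; in the biological setting discussed in the video the inverse would be implemented by learned feedback connections rather than an explicit matrix inverse.

```python
import numpy as np

rng = np.random.default_rng(0)

# Feedforward (encoder) function f(x) = tanh(W x): maps an input to a hidden code.
n_in, n_hidden = 8, 8
W = rng.normal(scale=0.3, size=(n_hidden, n_in))

def f(x):
    return np.tanh(W @ x)

# Approximate inverse g(h) ~= f^{-1}(h). Here W is square and invertible, so an
# algebraic inverse exists; in general g would itself be learned (e.g., by a
# decoder trained to reconstruct x from h).
W_inv = np.linalg.inv(W)

def g(h):
    # Clip to keep arctanh finite for codes near saturation.
    return W_inv @ np.arctanh(np.clip(h, -0.999, 0.999))

x = rng.normal(size=n_in)
x_rec = g(f(x))
print("reconstruction error:", np.linalg.norm(x - x_rec))
```

In the autoencoder view sketched here, the difference between an activity and its reconstruction can serve as a locally available error signal, which is the role the video attributes to feedback connections.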
Syllabus
Introduction
Neural networks learn
Feedback
Synaptic symmetry
Error signals
Spiking rates
NGRAD hypothesis
Autoencoders
Single-layer autoencoders
Feedforward function
Approximate inverse
Taught by
Yannic Kilcher