Overview
Explore the philosophical foundations and scientific nature of deep learning in this thought-provoking lecture by renowned AI researcher Yann LeCun. Delve into the epistemology of deep learning, asking whether the field is closer to alchemy or to science. Trace the historical development of neural networks, from their biological inspiration to the modern era of deep learning. Investigate the interplay between theory and invention, and see how biological systems have shaped artificial neural network architectures. Analyze the standard paradigm of pattern recognition, the impact of the neural net winter, and the resurgence of multilayer neural networks. Gain insight into the inspiration behind convolutional neural networks and their connection to the visual cortex. Evaluate the role of learning theory in deep learning and draw lessons from the field's evolution. Conclude with a critical look at what a Support Vector Machine (SVM) really is and where it fits in the broader context of machine learning.
Syllabus
Intro
DL: Engineering Science or Natural Science?
Theory often Follows Invention
Inspiration for DL: The Brain!
The Standard Paradigm of Pattern Recognition
1969-1985: Neural Net Winter
Biological Inspiration?
Theory is Good, Because it Makes Empiricism Efficient
Multilayer Neural Nets and Deep Learning
Inspiration for ConvNets: The Visual Cortex!
What About Learning Theory?
Lessons learned
What's an SVM, really?
Taught by
Institute for Advanced Study