Overview
Dive into the fundamentals of deep learning with this comprehensive tutorial from the MIT BMM Summer Course 2018. Led by Eugenio Piasini and Yen-Ling Kuo, explore key concepts such as supervised learning, multilayer perceptrons, and the rationale behind deep networks. Gain insights into backpropagation, quadratic loss, and the historical development of neural networks. Delve into convolutional neural networks, covering convolution, nonlinearity, pooling, and their evolutionary steps. Survey applications of deep learning, examine challenges such as gradient explosion, and learn about recurrent units and training paradigms. This 68-minute session provides a solid foundation for understanding and implementing deep learning techniques.
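As a quick taste of two of the topics covered (quadratic loss and backpropagation), here is a minimal sketch, not taken from the lecture itself: a single sigmoid unit trained by gradient descent on a tiny made-up dataset, applying the same chain-rule mechanics that backpropagation extends layer by layer in a multilayer perceptron.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Toy data (illustrative values, not from the course): learn y = 1 when x > 0.
data = [(-2.0, 0.0), (-1.0, 0.0), (1.0, 1.0), (2.0, 1.0)]

w, b, lr = 0.0, 0.0, 0.5
for epoch in range(2000):
    for x, y in data:
        a = sigmoid(w * x + b)           # forward pass
        # Quadratic loss L = 0.5 * (a - y)^2; the chain rule gives:
        delta = (a - y) * a * (1.0 - a)  # dL/dz = (a - y) * sigmoid'(z)
        w -= lr * delta * x              # dL/dw = delta * x
        b -= lr * delta                  # dL/db = delta
```

After training, `sigmoid(w * x + b)` classifies all four points correctly; a real network repeats this gradient computation for every weight in every layer.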
Syllabus
Introduction
Supervised Learning
Multilayer Perceptron
Why Deep Networks
Backpropagation
Quadratic Loss
Historical Perspective
Convolution
Example
Nonlinearity
Pooling
Evolutionary Steps
Applications
Gradient Explosion
Recurrent Unit
Training Paradigm
Taught by
MITCBMM