Overview
Dive into the foundations of deep learning with this comprehensive lecture from MIT's Introduction to Deep Learning course. Explore key concepts including perceptrons, neural networks, activation functions, loss functions, gradient descent, and backpropagation. Learn about practical aspects of training neural networks, such as setting learning rates, using batched gradient descent, and applying regularization techniques like dropout and early stopping. Gain a solid understanding of why deep learning is powerful and how it can be applied to solve complex problems.
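As a taste of the lecture's starting point, the single perceptron computes y = g(w·x + b): a weighted sum of the inputs plus a bias, passed through a nonlinear activation function. Below is a minimal NumPy sketch of that forward pass; the weights, bias, and input are made-up values chosen only for illustration, and the course's own labs may use a different framework.

```python
import numpy as np

def sigmoid(z):
    # Sigmoid activation: squashes the pre-activation into (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

def perceptron(x, w, b):
    # Single perceptron: weighted sum of inputs plus a bias,
    # passed through a nonlinear activation function g.
    return sigmoid(np.dot(w, x) + b)

# Hypothetical weights, bias, and input, chosen only to show the forward pass.
w = np.array([3.0, -2.0])
b = 1.0
x = np.array([-1.0, 2.0])
print(perceptron(x, w, b))   # ~0.0025, i.e. g(w·x + b) with w·x + b = -6
```

Stacking many such units into layers gives the neural networks that the rest of the lecture builds on.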
Syllabus
- Introduction
- Course information
- Why deep learning?
- The perceptron
- Activation functions
- Perceptron example
- From perceptrons to neural networks
- Applying neural networks
- Loss functions
- Training and gradient descent
- Backpropagation
- Setting the learning rate
- Batched gradient descent
- Regularization: dropout and early stopping
- Summary
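To make the training-related syllabus items above concrete (gradient descent, backpropagation, the learning rate, batched gradient descent, and regularization via dropout and early stopping), here is a small, self-contained NumPy sketch of mini-batch gradient descent with manual backpropagation, inverted dropout on a hidden layer, and early stopping on a validation set. The data, network size, and hyperparameters are hypothetical choices for illustration, not the course's lab code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (hypothetical): y = 3*x1 - 2*x2 + noise.
X = rng.normal(size=(200, 2))
y = X @ np.array([3.0, -2.0]) + 0.1 * rng.normal(size=200)
X_train, y_train = X[:160], y[:160]
X_val, y_val = X[160:], y[160:]

# One hidden layer with ReLU, scalar output; small random initial weights.
W1 = 0.1 * rng.normal(size=(2, 16)); b1 = np.zeros(16)
W2 = 0.1 * rng.normal(size=(16, 1)); b2 = np.zeros(1)

def forward(X, train=False, drop_p=0.2):
    # Forward pass; inverted dropout is applied to the hidden layer only during training.
    h = np.maximum(0.0, X @ W1 + b1)                     # ReLU activation
    mask = np.ones_like(h)
    if train:
        mask = (rng.random(h.shape) > drop_p) / (1.0 - drop_p)
        h = h * mask
    y_hat = (h @ W2 + b2).ravel()
    return y_hat, h, mask

def mse(y_hat, y):
    # Mean-squared-error loss.
    return np.mean((y_hat - y) ** 2)

lr, batch_size, patience = 0.05, 32, 10                  # learning rate, batch size, early-stop patience
best_val, wait = np.inf, 0

for epoch in range(500):
    # Batched (mini-batch) gradient descent over a shuffled training set.
    order = rng.permutation(len(X_train))
    for start in range(0, len(X_train), batch_size):
        idx = order[start:start + batch_size]
        Xb, yb = X_train[idx], y_train[idx]
        y_hat, h, mask = forward(Xb, train=True)

        # Backpropagation: chain rule applied layer by layer for the MSE loss.
        d_out = 2.0 * (y_hat - yb)[:, None] / len(yb)    # dL/d(output)
        dW2 = h.T @ d_out
        db2 = d_out.sum(axis=0)
        d_h = (d_out @ W2.T) * mask * (h > 0)            # through dropout and ReLU
        dW1 = Xb.T @ d_h
        db1 = d_h.sum(axis=0)

        # Gradient-descent update with a fixed learning rate.
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2

    # Early stopping: quit when validation loss stops improving.
    val_loss = mse(forward(X_val)[0], y_val)
    if val_loss < best_val - 1e-4:
        best_val, wait = val_loss, 0
    else:
        wait += 1
        if wait >= patience:
            print(f"early stop at epoch {epoch}, val MSE {val_loss:.4f}")
            break

print(f"best validation MSE: {best_val:.4f}")
```

Lowering the learning rate or raising the patience trades training time for stability; the lecture discusses these trade-offs in the learning-rate and regularization segments.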
Taught by
Alexander Amini (https://www.youtube.com/@AAmini/videos)