Overview
Dive into the foundations of deep learning with this comprehensive lecture from MIT's Introduction to Deep Learning course. Explore key concepts including perceptrons, neural networks, loss functions, gradient descent, and backpropagation. Learn about crucial training techniques such as setting the learning rate, batched gradient descent, and regularization methods like dropout and early stopping. Gain insight into why deep learning is transforming so many fields and how to apply neural networks effectively. Access additional course materials, including slides and lab exercises, through the provided link, and stay up to date on the latest deep learning developments at MIT by following the course's social media channels.
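To make those concepts concrete, here is a minimal, self-contained sketch (not taken from the lecture) of a single perceptron trained by gradient descent on a squared-error loss. The sigmoid activation, the learning rate of 0.1, and the toy inputs and target are illustrative assumptions, not settings from the course.

```python
import numpy as np

# A single perceptron: weighted sum of the inputs plus a bias, passed
# through a nonlinear activation (a sigmoid here).
def perceptron(x, w, b):
    return 1.0 / (1.0 + np.exp(-(np.dot(w, x) + b)))

# One gradient-descent step on a squared-error loss. The gradients come
# from the chain rule, i.e. backpropagation specialized to a single unit.
def sgd_step(x, y_true, w, b, lr=0.1):
    y_pred = perceptron(x, w, b)
    d_loss = y_pred - y_true             # dL/dy for L = 0.5 * (y_pred - y_true)**2
    d_act = y_pred * (1.0 - y_pred)      # derivative of the sigmoid
    dz = d_loss * d_act                  # gradient at the pre-activation
    return w - lr * dz * x, b - lr * dz  # step against the gradient

# Toy usage: nudge the weights until the output approaches the target 1.0.
x, w, b = np.array([0.5, -1.2]), np.array([0.1, 0.3]), 0.0
for _ in range(200):
    w, b = sgd_step(x, 1.0, w, b)
print(perceptron(x, w, b))
```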
Syllabus
- Introduction
- Course information
- Why deep learning?
- The perceptron
- Perceptron example
- Applying neural networks
- Loss functions
- Training and gradient descent
- Backpropagation
- Setting the learning rate
- Batched gradient descent
- Regularization: dropout and early stopping (see the sketch after this list)
- Summary
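For a feel of the later syllabus items, the sketch below combines batched (mini-batch) gradient descent, inverted dropout, and validation-based early stopping in a tiny one-hidden-layer network. The toy dataset, layer sizes, learning rate, dropout probability, and patience value are illustrative assumptions rather than settings from the lecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: learn y = sin(x) from noisy samples, with a held-out
# validation split used only for early stopping.
X = rng.uniform(-3, 3, size=(256, 1))
y = np.sin(X) + 0.1 * rng.normal(size=X.shape)
X_train, y_train = X[:192], y[:192]
X_val, y_val = X[192:], y[192:]

# One hidden ReLU layer; the sizes are illustrative, not from the lecture.
W1 = rng.normal(scale=0.5, size=(1, 32)); b1 = np.zeros(32)
W2 = rng.normal(scale=0.5, size=(32, 1)); b2 = np.zeros(1)

def forward(X, drop_p=0.0, train=False):
    """Forward pass; applies inverted dropout to the hidden layer when training."""
    h = np.maximum(0.0, X @ W1 + b1)
    if train and drop_p > 0:
        mask = (rng.random(h.shape) > drop_p) / (1.0 - drop_p)
    else:
        mask = np.ones_like(h)
    h = h * mask
    return h, mask, h @ W2 + b2

lr, batch_size, drop_p = 0.05, 32, 0.2
best_val, patience, bad_epochs = np.inf, 10, 0

for epoch in range(500):
    # Batched gradient descent: shuffle the training set, then update the
    # weights once per mini-batch instead of once per full pass.
    order = rng.permutation(len(X_train))
    for start in range(0, len(X_train), batch_size):
        batch = order[start:start + batch_size]
        Xb, yb = X_train[batch], y_train[batch]
        h, mask, pred = forward(Xb, drop_p, train=True)

        # Backpropagate the squared-error loss (half-MSE) through both layers.
        d_pred = (pred - yb) / len(Xb)
        dW2 = h.T @ d_pred; db2 = d_pred.sum(axis=0)
        dh = (d_pred @ W2.T) * mask * (h > 0)
        dW1 = Xb.T @ dh;    db1 = dh.sum(axis=0)

        # Gradient-descent step, scaled by the learning rate.
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2

    # Early stopping: quit once validation loss stops improving for `patience` epochs.
    _, _, val_pred = forward(X_val)
    val_loss = float(np.mean((val_pred - y_val) ** 2))
    if val_loss < best_val:
        best_val, bad_epochs = val_loss, 0
    else:
        bad_epochs += 1
        if bad_epochs >= patience:
            print(f"early stop at epoch {epoch}, val loss {best_val:.4f}")
            break
```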
Taught by
Alexander Amini (https://www.youtube.com/@AAmini/videos)